AzDO Pipelines
Version-control & CI/CD your Power BI reports: Using DevOps Git Repos and Pipelines
I'll show you how to orchestrate, through Azure DevOps pipelines, a fully functional CI/CD cycle for your Power BI reports.
Power BI
Git integration empowers you to streamline your development workflow, collaborate, and maintain version control for your Power BI reports within the Fabric ecosystem. With deployment pipelines, you can define a series of steps that move your report from development to production environments.
No more PBIX files flying around! Control your Power BI reports using modern software development practices... I'll show you how to use two new features of the Premium pricing tier: Fabric Git integration for versioning and Power BI deployment pipelines for faster delivery of content updates.
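As a taste of what automating a deployment pipeline can look like, here is a hedged sketch of an Azure DevOps step that triggers a Power BI deployment pipeline's "Deploy All" REST endpoint. The `$(accessToken)` variable, the `<pipelineId>` placeholder, and the stage order are illustrative assumptions, not values from this series.

```yaml
# Sketch: trigger a Power BI deployment pipeline from an AzDO step.
# $(accessToken) and <pipelineId> are placeholders you must supply.
steps:
  - script: |
      curl -X POST \
        -H "Authorization: Bearer $(accessToken)" \
        -H "Content-Type: application/json" \
        -d '{"sourceStageOrder": 0}' \
        "https://api.powerbi.com/v1.0/myorg/pipelines/<pipelineId>/deployAll"
    displayName: 'Deploy content from Development to the next stage'
```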
AzDO Pipelines
Learn to create a Visual Studio Database project, host it in a DevOps Git repo, and deploy it using Azure DevOps pipelines.
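A minimal sketch of what such a pipeline's deploy stage might contain: build the `.sqlproj` into a DACPAC, then push it with the `SqlAzureDacpacDeployment` task. The project, server, database, and service connection names are illustrative assumptions.

```yaml
# Sketch: build a database project and deploy the resulting DACPAC.
# All resource names below are made-up examples.
steps:
  - task: VSBuild@1
    inputs:
      solution: 'MyDatabase.sqlproj'
      configuration: 'Release'
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: 'my-service-connection'   # assumed service connection
      ServerName: 'myserver.database.windows.net'
      DatabaseName: 'mydb'
      DeployType: 'DacpacTask'
      DacpacFile: '$(Build.SourcesDirectory)/bin/Release/MyDatabase.dacpac'
```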
ADF simplifies the orchestration of Azure Databricks notebooks, streamlining your data workflows and ensuring efficient data processing.
Learn how to use DevOps YAML pipelines to create artifacts from collections of Python files and effortlessly deploy them to Databricks workspaces. Gain insights into optimizing your workflow for efficiency and reliability, ensuring smooth transitions between environments.
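The shape of such a pipeline can be sketched as two stages: one that publishes the Python files as an artifact, and one that imports them into a workspace with the (legacy) Databricks CLI. The secret variables and the `/Shared/etl` target path are illustrative assumptions.

```yaml
# Sketch: package Python notebooks as an artifact, then import them
# into a Databricks workspace. Paths and secrets are placeholders.
stages:
  - stage: Build
    jobs:
      - job: Package
        steps:
          - publish: '$(Build.SourcesDirectory)/notebooks'
            artifact: notebooks
  - stage: Deploy
    jobs:
      - job: ImportToWorkspace
        steps:
          - download: current
            artifact: notebooks
          - script: |
              pip install databricks-cli
              databricks workspace import_dir "$(Pipeline.Workspace)/notebooks" /Shared/etl --overwrite
            env:
              DATABRICKS_HOST: $(databricksHost)    # assumed secret variables
              DATABRICKS_TOKEN: $(databricksToken)
```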
Azure Databricks SCIM allows you to sync users and groups from Azure Active Directory to Azure Databricks, ensuring consistent access control across platforms and simplifying user provisioning and deprovisioning.
AzDO Pipelines
Discover the power of BICEP in provisioning a robust Databricks workspace infrastructure, including Azure Data Factory, Key Vault and a Storage account; I'll use some PowerShell to configure RBAC permissions between resources and show you how to use the logging command task.setvariable.
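For readers unfamiliar with logging commands, here is a minimal sketch of `task.setvariable` in action: a script step writes the command to stdout, the agent parses it, and later steps can read the variable. The variable name and the resource ID value are made-up examples.

```yaml
# Sketch: set a variable in one step via the task.setvariable logging
# command, then read it in a later step. Values are placeholders.
steps:
  - pwsh: |
      $id = "/subscriptions/xxx/resourceGroups/rg-demo"   # placeholder value
      Write-Host "##vso[task.setvariable variable=resourceId]$id"
    displayName: 'Set variable from script output'
  - script: echo "Resource ID is $(resourceId)"
    displayName: 'Read the variable in a later step'
```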
Databricks
In this series I'm going to show you how to provision your Databricks infrastructure with BICEP and connect your workspace to Azure's Entra ID to manage users & groups. Furthermore, I'll show you how to deploy your notebooks across environments with YAML pipelines and orchestrate them with ADF.
AzDO Pipelines
This would be our final leap forward, where I divide ADF components into two groups: Core and Data-Product. I'll explain why I think this is a good approach for enterprise-scale setups where you might have a team dedicated to providing infrastructure services.
AzDO Pipelines
This would be a leap forward from publishing a full ARM template to a selective deployment of components. I'll introduce the Azure DevOps Marketplace, show you how to install the Deploy Azure Data Factory (#adftools) extension created by Kamil Nowinski, and walk step-by-step through configuring our ADF.
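To give a flavour of selective deployment, here is a hedged sketch using the `azure.datafactory.tools` PowerShell module that backs the #adftools extension, publishing ADF objects straight from their JSON source. The folder and resource names are illustrative assumptions.

```yaml
# Sketch: selective ADF deployment from JSON source files via the
# azure.datafactory.tools module. Resource names are made-up examples.
steps:
  - pwsh: |
      Install-Module azure.datafactory.tools -Scope CurrentUser -Force
      Publish-AdfV2FromJson -RootFolder "$(Build.SourcesDirectory)/adf" `
        -ResourceGroupName "rg-data" `
        -DataFactoryName "adf-dev-weu" `
        -Location "westeurope"
    displayName: 'Publish ADF components from JSON files'
```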
AzDO Pipelines
I'll show step-by-step how to configure the new CI/CD flow described by Microsoft, explain in which cases I recommend this simple yet powerful method, and include some best practices and recommendations.
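The core of Microsoft's new flow is validating and exporting the ARM template at build time with the `@microsoft/azure-data-factory-utilities` npm package, which a pipeline step can sketch as follows. It assumes a `package.json` wiring up the package's `build` script; the subscription, resource group, and factory names are placeholders.

```yaml
# Sketch: export the ADF ARM template at build time using the
# @microsoft/azure-data-factory-utilities package. IDs are placeholders.
steps:
  - task: Npm@1
    inputs:
      command: 'custom'
      workingDir: '$(Build.SourcesDirectory)/adf'   # assumes a package.json here
      customCommand: 'run build export $(Build.SourcesDirectory)/adf /subscriptions/<subId>/resourceGroups/<rg>/providers/Microsoft.DataFactory/factories/<adfName> "ArmTemplate"'
  - publish: '$(Build.SourcesDirectory)/adf/ArmTemplate'
    artifact: ArmTemplate
```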