Demystifying Fabric SQL Databases

Unlock the potential of Fabric SQL Databases! Learn how to migrate your SQL workloads using familiar tools like Database Projects, DACPACs, and BACPACs. Preserve your existing investments while embracing Fabric’s ecosystem with seamless schema and data transitions.

Introduction

Migrating SQL databases to Microsoft Fabric can seem daunting, especially when organizations have invested years into finely tuned schemas, stored procedures, and workflows. However, the process doesn’t need to be a reinvention. By leveraging well-established SQL tools such as Database Projects, DACPACs, and BACPACs, you can seamlessly migrate your databases into Fabric while maintaining data integrity and compatibility with new features.

This article explores practical techniques to migrate SQL databases to Fabric using these tools, enabling you to make the most of Fabric’s ecosystem without discarding your existing SQL investments.


Database Projects, DACPAC & BACPAC

SQL Database Projects represent the schema of a database in a structured, source-controlled format. They allow you to:

  • Version-Control Your Schema: Track changes to tables, views, stored procedures, and more using Git.
  • Enable CI/CD Workflows: Use tools like Azure DevOps to automate builds and deployments.
  • Ensure Consistency: Deploy the same schema to multiple environments (e.g., dev, test, production) with minimal effort.

A DACPAC (Data-tier Application Component Package) is a portable artifact that defines the entire structure of a database. We use DACPACs to create, update, or revert database schemas. The key benefits are:

🤓
DACPACs are the build artifact produced by a SQL Database Project.
  • Pre-Deployment Validation: Validate database updates before they run, identifying potential issues such as data loss from column size changes.
  • Seamless Integration with DevOps: Eliminates the need to exchange scripts and ad hoc integration notes.
  • Control Parameters: Let you customize the deployment process, tailoring it to specific requirements and minimizing risk during updates (a SqlPackage sketch follows this list). Examples:
    • DropObjectsNotInSource: Determines whether to delete objects in the target that aren’t in the source schema.
    • BlockOnPossibleDataLoss: Prevents deployment if changes could result in data loss, such as altering a column’s data type or reducing its size.
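
As an illustration, here is a hedged sketch of how these parameters surface on the SqlPackage command line; the DACPAC file name and the connection-string variable are placeholders, not values from any specific project:

```bash
# Preview the changes a deployment would make (including possible data loss)
# without touching the target: DeployReport writes an XML report only.
SqlPackage /Action:DeployReport \
  /SourceFile:"bin/Release/AdventureDb.dacpac" \
  /TargetConnectionString:"$TARGET_CONNECTION_STRING" \
  /OutputPath:"deploy-report.xml"

# Publish the DACPAC with explicit safety parameters:
#   BlockOnPossibleDataLoss=True  -> abort if a change could drop data
#   DropObjectsNotInSource=False  -> leave extra objects in the target alone
SqlPackage /Action:Publish \
  /SourceFile:"bin/Release/AdventureDb.dacpac" \
  /TargetConnectionString:"$TARGET_CONNECTION_STRING" \
  /p:BlockOnPossibleDataLoss=True \
  /p:DropObjectsNotInSource=False
```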

A BACPAC (Backup Package) contains both the database schema and the data, which makes it ideal for full database migrations (e.g., moving from on-premises to the cloud).

Key Differences

Feature      | DACPAC                              | BACPAC
-------------|-------------------------------------|----------------------------
Content      | Schema only                         | Schema + Data
Purpose      | Schema evolution/deployment         | Full database migration
File Size    | Smaller                             | Larger (includes data)
Primary Use  | CI/CD (e.g., deployment pipelines)  | Backup and migration tasks

Demo: From Azure SQL to Fabric SQL

The demo moves a database from Azure SQL to a Fabric SQL Database in three steps: capture the schema in a Database Project, deploy it with a DACPAC, and migrate the data with a BACPAC, automating deployment with an Azure DevOps pipeline.

💡
👨‍💻 Remember to download the code from this AzDO Git repository (https://dev.azure.com/techtacofriday/blog-posts/_git/demystifying-fabric-sql-databases) and follow along! 😁

1. Starting with Database Projects

A Database Project is the foundation for managing database schema. Here’s how to get started:

  • Use Azure Data Studio to create a new database project (a command-line alternative is sketched at the end of this step).
  • Import the schema from your existing SQL database using the Schema Compare feature, which highlights differences between your source database and the project.
  • Commit the schema to version control, ensuring you can track and manage changes.
Benefits: Database Projects bring order and repeatability to deployments, helping teams avoid errors and maintain consistency across environments.
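
For reference, here is a rough command-line sketch of the same setup using the Microsoft.Build.Sql project templates and SqlPackage; the project name and connection-string variable are placeholders, and the tooling versions in your environment may differ:

```bash
# One-time setup: the SqlPackage CLI and the SDK-style SQL project templates.
dotnet tool install -g microsoft.sqlpackage
dotnet new install Microsoft.Build.Sql.Templates

# Scaffold an SDK-style SQL Database Project to hold the schema.
dotnet new sqlproj -n AdventureDb

# Extract the schema of the existing database as one .sql file per object,
# ready to add to the project and commit to Git.
SqlPackage /Action:Extract \
  /SourceConnectionString:"$SOURCE_CONNECTION_STRING" \
  /TargetFile:"AdventureDb/" \
  /p:ExtractTarget=SchemaObjectType
```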

2. Deploying the Schema with DACPACs

Once your schema is ready in a Database Project, you can compile it into a DACPAC:

  • Use Azure DevOps pipelines or local tools to build the project into a DACPAC.
  • Deploy the DACPAC to a Fabric SQL Database, using options like /DropObjectsNotInSource to control how existing objects are handled in the target database (see the sketch after this step).
Scenario: A DACPAC deployment is ideal when you want to transfer schema changes without affecting existing data in the target database.
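
A minimal sketch of what the pipeline (or a local run) executes, assuming an SDK-style project named AdventureDb and a Fabric SQL connection string stored in a secret pipeline variable:

```bash
# Build the Database Project; the build output is the DACPAC.
dotnet build AdventureDb/AdventureDb.sqlproj -c Release

# Publish the DACPAC to the Fabric SQL Database with conservative defaults.
SqlPackage /Action:Publish \
  /SourceFile:"AdventureDb/bin/Release/AdventureDb.dacpac" \
  /TargetConnectionString:"$FABRIC_SQL_CONNECTION_STRING" \
  /p:DropObjectsNotInSource=False \
  /p:BlockOnPossibleDataLoss=True
```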

3. Migrating with BACPACs

For cases where both schema and data need to be transferred:

  • Export a BACPAC from your existing database (e.g., Azure SQL or on-prem SQL Server).
  • Restore the BACPAC into a Fabric SQL Database using SqlPackage or Azure DevOps pipelines (a SqlPackage sketch follows this step).
Note: The target Fabric SQL Database must be empty before importing a BACPAC, since the import recreates the entire schema and data from scratch.
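
A minimal sketch of that round trip with SqlPackage, where the connection-string variables and file name are placeholders:

```bash
# Export schema + data from the source Azure SQL database into a BACPAC.
SqlPackage /Action:Export \
  /SourceConnectionString:"$AZURE_SQL_CONNECTION_STRING" \
  /TargetFile:"AdventureDb.bacpac"

# Import the BACPAC into the empty Fabric SQL Database.
SqlPackage /Action:Import \
  /SourceFile:"AdventureDb.bacpac" \
  /TargetConnectionString:"$FABRIC_SQL_CONNECTION_STRING"
```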

Conclusion: Transforming SQL Management in Fabric

By leveraging tools like Database Projects, DACPACs, and BACPACs, migrating to Fabric SQL Databases becomes a smooth, predictable process. These techniques allow you to preserve years of SQL investments while unlocking the power of Fabric’s integrated data and AI services.

Whether you're modernizing legacy databases or adopting Fabric for the first time, these approaches provide a clear path forward. Ready to get started? Let me know your thoughts or questions in the comments below!