
Deploying .dacpacs to Multiple Environments via ADO Pipelines


Introduction

This is the next post in our series on SQL databases. In the last post we created a single .dacpac and deployed it to one environment via an Azure DevOps pipeline. In this article we will expand on that and walk through the next evolution: building .dacpacs with configurations for the dev and tst environments and deploying them to multiple SQL servers and databases.

I would venture to say this is where we up the difficulty. We will discuss concepts such as pre- and post-deployment scripts, which may vary across environments. In this case there is a post-deployment script to secure each environment, plus a seed script for the development environment only.

As with the other blogs in this series, the full code is available on GitHub.

Prerequisites

If you followed along with the last post, you will already have created the appropriate SQL Server and SQL Database and provisioned the appropriate ADO service account permissions.

If this is your first post in the series, don’t worry. The full list of prerequisites is as follows:

Process

To start, we need to take a step back and talk about what our high-level workflow looks like. We would like to leverage pre/post-deployment scripts to set up Entra users and groups in our SQL database. The list of users and roles may vary depending on the environment.

Just as important is reading the documentation on pre/post-deployment scripts. It states that pre-/post-deployment scripts are included in the .dacpac; however, they are not compiled into or validated against the database object model. This is important: it means we will need a different .dacpac file for each environment.

So what do we do? One of the more widely used concepts in application development is build configurations. Build configurations allow us to effectively provide different configuration settings, i.e. pre- and post-deployment scripts, for each .dacpac we generate. Because I’m using the SDK-style process for SQL projects, I will use VS Code for this; alternatively, you can use Azure Data Studio. Note that this process is also in preview within Visual Studio.

Now let’s go through these steps and get a high-level overview of what our deployment will look like. I’m going to break this down at the stage and job level. Jobs can run in parallel, while stages run sequentially. I’ve omitted the publish task here, but we’ll see it in the YAML file.

  • Stage: Build
    • Job: build dev .dacpac
    • Job: build tst .dacpac
  • Stage: Deploy dev
    • Job: deploy dev .dacpac
    • Run the script to seed data
    • Run the script to configure security
  • Stage: Deploy tst
    • Job: deploy tst .dacpac
    • Run the script to configure security
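
The outline above maps onto a YAML skeleton roughly like the following. The stage and job names here are illustrative; the real definitions appear later in the post:

```yaml
# High-level pipeline shape: one build stage containing a job per environment,
# followed by one deployment stage per environment.
stages:
- stage: build
  jobs:                                    # jobs within a stage can run in parallel
  - job: build_publish_sql_sqlmoveme_dev
  - job: build_publish_sql_sqlmoveme_tst
- stage: deploy_dev                        # stages run sequentially
  jobs:
  - deployment: deploy_sqlmoveme_dev       # the dev .dacpac's post-deployment
                                           # script seeds data and configures security
- stage: deploy_tst
  jobs:
  - deployment: deploy_sqlmoveme_tst       # the tst post-deployment script
                                           # configures security only
```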

Post-deployment script

The first part we’ll cover is creating the post-deployment scripts. For this exercise, dev will have two scripts: one to assign the Entra group ‘sqlmoveme_[Environment Name]_admins’ the ‘db_owner’ role, and one to seed a table. In this example we seed some sample data into the [SalesLT].[Address] table. For the security script, you should create an idempotent script that checks whether the group already exists in the database and, if not, creates it before assigning permissions. I’ve always been tempted to just delete the user and re-add it; however, I would advise against this, since you may hit problems recreating the user or assigning permissions. If the script fails, it will lock users out of the live database!

Security script:

USE [sqlmoveme]

IF NOT EXISTS (SELECT * FROM sys.database_principals WHERE name = 'sqlmoveme_[Environment Name]_admins')
    BEGIN
        CREATE USER [sqlmoveme_[Environment Name]_admins] FROM EXTERNAL PROVIDER
    END
EXEC sp_addrolemember 'db_owner', 'sqlmoveme_[Environment Name]_admins';

GO

Seed script:

USE [sqlmoveme]
BEGIN
    DELETE FROM [SalesLT].[Address];
    INSERT INTO [SalesLT].[Address] VALUES ('Apartment 5A','129 W. 81st St.','New York','NY','USA','10001',NEWID(),SYSDATETIME());
    INSERT INTO [SalesLT].[Address] VALUES ('2311 North Los Robles Avenue','','Pasadena','CA','USA','91001',NEWID(),SYSDATETIME());
    INSERT INTO [SalesLT].[Address] VALUES ('742 Evergreen Terrace','','Springfield','IL','USA','65619',NEWID(),SYSDATETIME());
END
GO

Finally, there is an interesting limitation when defining pre/post-deployment scripts: a project can have only one of each. The solution to this, as documented by Microsoft, is to create a per-environment wrapper script that uses sqlcmd :r notation to call the additional scripts.

Environment script:

:r [Environment Name].post.sqlmoveme.security.sql
:r [Environment Name].post.sqlmoveme.seed.sql
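
As a concrete illustration (the exact file names are my assumption, following the pattern above), the dev wrapper would include both scripts, while the tst wrapper includes only the security script:

```sql
-- dev.post.sqlmoveme.sql: dev runs the security and seed scripts
:r dev.post.sqlmoveme.security.sql
:r dev.post.sqlmoveme.seed.sql

-- tst.post.sqlmoveme.sql: tst runs only the security script
:r tst.post.sqlmoveme.security.sql
```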

Script location

These scripts must live in the folder alongside the .sqlproj, which will reference them.

j_folberth_0-1726687810780.png

In my case I created a folder called ‘scripts’ to hold all the environment scripts. It is important to understand that these .sql files are considered part of the project; therefore, we must account for them when building the project.

Exclude .sql files from build

Here’s where it can get confusing. Because the .dacpac build attempts to compile every .sql file in the project, we need to exclude these environment-specific files from the build. This requires updating the .sqlproj behind the scenes. First, we tag the scripts so they are excluded from compilation:

  <ItemGroup>
    <!-- Exclude the environment scripts from dacpac compilation,
         but keep them in the project as plain files -->
    <Build Remove="scripts\dev.post.sqlmoveme.sql" />
    <Build Remove="scripts\dev.post.sqlmoveme.security.sql" />
    <Build Remove="scripts\dev.post.sqlmoveme.seed.sql" />
    <Build Remove="scripts\tst.post.sqlmoveme.sql" />
    <Build Remove="scripts\tst.post.sqlmoveme.security.sql" />
    <None Include="scripts\**\*.sql" />
  </ItemGroup>

Without this part, the build will fail. For more information, please check the Microsoft documentation here.

Create a build configuration

This is the part of the blog where we move into a more advanced category. Let’s see how to do this in VS Code/Azure Data Studio, which requires editing the raw .sqlproj. This is currently how VS Code/Azure Data Studio handle project file updates for multiple build configurations. I don’t mean to be overly threatening: if you make a mistake, remember you’re under source control! You can always roll back to a previous working version.

Edit .sqlproj file

Double-click to open the .sqlproj file. A VS Code window will open containing your file. You should already have two basic build configurations: Debug and Release. You can keep those names if you wish; however, I like to match these values to my ADO environments. The first thing we want to do is add a conditional PropertyGroup that applies only to the desired build configuration. This is done using a block like the following:

  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'dev|AnyCPU' ">
  </PropertyGroup>

Then you want to define the following properties in your conditional:

    <OutputPath>bin\dev\</OutputPath>
    <BuildScriptName>$(MSBuildProjectName).sql</BuildScriptName>
    <TreatWarningsAsErrors>false</TreatWarningsAsErrors>
    <DebugSymbols>true</DebugSymbols>
    <DebugType>full</DebugType>
    <Optimize>false</Optimize>
    <DefineDebug>true</DefineDebug>
    <DefineTrace>true</DefineTrace>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>

Most important for the purposes of our move is OutputPath. This property tells the SDK where to place the built .dacpac, and we’ll want different paths for different environments. The rest of the properties match what the default Debug or Release configuration sets.

At the time of this writing there is a gap in the documentation about which properties can be set and what the acceptable values look like. The closest reference right now is Database Project Settings – SQL Server Data Tools (SSDT) | Microsoft Learn, which explains the settings that can be configured; however, it does not provide a complete list of properties. For up-to-date information, please visit: https://aka.ms/sqlprojects

Repeat for tst

Now let’s repeat the same steps for the tst environment, with some adjustments to account for the different configurations a team may have. I will remove the seed script, since the tst environment will have a more complete data set for testing that I don’t want to overwrite. I still want to run the security script, this time assigning the ‘sqlmoveme_tst_admins’ Entra group the ‘db_owner’ role.
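
As a sketch (assuming the tst configuration mirrors dev except for the output location), the tst conditional PropertyGroup differs only in where the .dacpac is written:

```xml
<!-- Illustrative: the tst build configuration mirrors dev,
     changing only the OutputPath -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'tst|AnyCPU' ">
  <OutputPath>bin\tst\</OutputPath>
</PropertyGroup>
```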

Please note that we are using Azure DevOps environments and will need one configured for ‘tst’. If this concept is new to you, you can review a previous blog for more details: Azure DevOps Pipelines: Environments and Variables (microsoft.com)

For the full tst code base, see the GitHub repository.

YAML build updates

First, I’d like to update the pipeline from my previous post, Leverage DotNet to Build SQL via YAML – Microsoft Community Hub. The first update passes the build configuration to ‘dotnet build’ via the --configuration parameter. Feel free to check the full list of available options at dotnet build command – .NET CLI | Microsoft Learn.

    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration tst /p:NetCoreBuild=true

The second part is cloning the build job for the second environment and updating it accordingly. Those of you who follow me know that whenever we need to duplicate something, we should take a look at Azure DevOps Pipelines: The Case for Scaling Templates – Microsoft Community Hub.

I will eventually guide you through the process of creating a template. For now, let’s create a second job for the tst environment. For optimal efficiency this should be a second job, so it can run in parallel, rather than an additional stage or a serial set of steps.

- job: build_publish_sql_sqlmoveme_tst
  steps:
  - task: UseDotNet@2
    displayName: Use .NET SDK v3.1.x
    inputs:
      packageType: 'sdk'
      version: 3.1.x
      includePreviewVersions: true
  - task: DotNetCoreCLI@2
    displayName: dotnet build
    inputs:
      command: build
      projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
      arguments: --configuration tst /p:NetCoreBuild=true
  - task: PublishPipelineArtifact@1
    displayName: Publish Pipeline Artifact sqlmoveme_tst_tst
    inputs:
      targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/tst
      artifact: sqlmoveme_tst_tst
      properties: ''

Let me explain the odd-looking artifact name sqlmoveme_tst_tst. It follows the naming pattern [projectName]_[environmentName]_[configurationName]. You don’t have to do this; however, I follow this pattern for flexibility and consistency across technology stacks. In other languages there might be scenarios such as tst_android, tst_mac, tst_v1, tst_v2.

Create a Tst Deployment Stage

At this point we have a build that produces two artifacts, one for dev and one for tst. In Deploying .dacpacs to Azure SQL via Azure DevOps Pipelines – Microsoft Community Hub, we already completed the YAML deployment steps for dev. Now we only need to add a stage for tst.

- stage: sqlmoveemecicd_tst_cus_dacpac_deploy
  jobs:
  - deployment: sqlmoveemecicd_app_tst_cus
    environment:
      name: tst
    dependsOn: []
   
    strategy:
      runOnce:
        deploy:
          steps:
          - task: SqlAzureDacpacDeployment@1
            displayName: Publish sqlmoveme on sql-adventureworksentra-tst-cus.database.windows.net
            inputs:
              DeploymentAction: Publish
              azureSubscription: AzureTstServiceConnection
              AuthenticationType: servicePrincipal
              ServerName: sql-adventureworksentra-tst-cus.database.windows.net
              DatabaseName: sqlmoveme
              deployType: DacpacTask
              DacpacFile: $(Agent.BuildDirectory)\sqlmoveme_tst_tst\**\*.dacpac
              AdditionalArguments: ''
              DeleteFirewallRule: True

Make note of a few adjustments here: the stage, job, and artifact names, the service connection, and the server name all now reference tst.

Now let’s run the pipeline!

final result

Let’s go through all these steps one by one and review what we built.

First, let’s check the Azure DevOps changes we made. Verify that the build generated two .dacpac files as pipeline artifacts: one built with the ‘tst’ configuration and one with the ‘dev’ configuration.

j_folberth_0-1726853305565.png

Check! Next, still in ADO, let’s confirm we have two deployment stages: one stage with a job deploying to ‘dev’ and the other with a job deploying to ‘tst’.

j_folberth_1-1726853521428.png

Great. One could argue ADO makes it easy to show everything is OK. Now let’s go to SQL and verify that our scripts not only executed, but executed in the appropriate environments. To log in, you may need to add the machine you are connecting from to the SQL Server firewall: IP firewall rules – Azure SQL Database and Azure Synapse Analytics | Microsoft Learn.

Let’s start with ‘dev’ and make sure the Entra group has the correct access. You can do this by running the following query:

SELECT P.name AS UserName, R.name AS RoleName
FROM sys.database_principals P
LEFT OUTER JOIN sys.database_role_members RM ON P.principal_id = RM.member_principal_id
LEFT OUTER JOIN sys.database_principals R ON R.principal_id = RM.role_principal_id
WHERE P.name = 'sqlmoveme_dev_admins'

When you run the query, you will see the following results:

j_folberth_2-1726854564489.png

All right, one script executed. But did the seed script also run? In this example, all entries in the Address table should have been deleted and repopulated.

j_folberth_3-1726854732928.png

It seems like everything is there.

Now how about ‘tst’? If you remember, the key point here is that we are not only running different scripts, we are running a different number of scripts in each environment. Let’s log into the ‘tst’ server and run the user access query to see the result.

j_folberth_4-1726854992922.png

It seems correct to me.

Conclusion

We have successfully built a single .sqlproj into a .dacpac per environment. Each .dacpac has its own post-deployment script to configure things like security, including adding an Entra security group as db_owner. We then deployed each .dacpac to the appropriate environment, where its post-deployment script ran.

Next steps

Now that we’ve covered how to build and deploy .dacpacs to multiple environments, we’ll move on to creating a YAML template so you don’t have to copy and paste pipeline steps. Feel free to subscribe to this series on SQL databases, or follow me if you like my posts.




