Leveraging DotNet for SQL Builds via YAML

by info.odysseyx@gmail.com | August 28, 2024

Introduction

Welcome! If you are new to this series of posts on SQL databases, welcome. Previously we discussed how to import an existing database into your local environment. This post specifically covers leveraging Azure DevOps Pipelines to build the database into a .dacpac file, which is what we will use for deployment. Ultimately, you will leverage YAML pipeline templates to achieve this across all of your database projects.

Wait, Why?

Starting with the question of "why" will help you understand the point of this exercise. The goal is to produce a .dacpac that can be deployed to a suitable Azure environment. To achieve this, we need an automated process that takes the .sqlproj and builds it into a reusable .dacpac. Here is one important point that is often overlooked: the .dacpac must be static! If you re-run the deployment process, it should deploy the same .dacpac again. This is an important concept to understand when we discuss multi-stage deployments and, potentially, rollbacks.

Prerequisites

We'll continue with the same software requirements from the last post, with a few additions.

Version Control

I want to take a moment to call this out. Yes, it is a prerequisite, but in my experience it can be one of the biggest gaps when implementing any type of automated database deployment. Traditionally, version control technologies like Git or SVN were thought of as being reserved for application developers. Technology and roles have evolved, and that is no longer the case. Data engineers working with Databricks, Data Factory, or SQL development should expect to use a version control system, which allows them to collaborate quickly, deploy code at any time, and keep a record of every change, including its history and the reasons behind it.

Build Steps

When writing a pipeline like this, I find it helpful to write out the individual steps required to build it. In our case, they are:

1. Publish the SQL script folder for pipeline use (I used "security" as a placeholder, but this could be a pre/post-deployment script folder)
2. Get the appropriate version of the .NET SDK
3. Run a dotnet build against the .sqlproj
4. Publish the .dacpac file for pipeline use

That's four tasks. I've grouped them to show the dependencies between them, which means we can split the work across two jobs to keep the build efficient.

Publish the Script Folder

This step is technically optional, but in my experience there are many cases where you need to run a script on the server before or after deploying a SQL database, so I'm including it because it's a common requirement.

```yaml
jobs:
  - job: Publish_security
    steps:
      - task: PublishPipelineArtifact@1
        displayName: 'Publish Pipeline Artifact security'
        inputs:
          targetPath: security
          artifact: security
          properties: ''
```

Pretty basic, right? We effectively just pass the name of the source folder and the name we want to give the artifact to the PublishPipelineArtifact@1 task.
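To show where this artifact ends up being used, here is a minimal sketch of how a later job could pull the security scripts back down with the built-in DownloadPipelineArtifact@2 task. The job name here is hypothetical; only the artifact name security comes from the publish job above.

```yaml
# Hypothetical consumer job - only the artifact name 'security' comes from the publish job above.
- job: run_security_scripts
  dependsOn: Publish_security
  steps:
    - task: DownloadPipelineArtifact@2
      displayName: 'Download Pipeline Artifact security'
      inputs:
        artifact: security
        path: $(Pipeline.Workspace)/security
    # The scripts are now on the agent under $(Pipeline.Workspace)/security,
    # ready for a pre- or post-deployment step to execute them.
```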
The .dacpac Build Task

This is the bulk of our operation, so it's a little longer, but don't worry, we'll go through each step.

```yaml
- job: build_publish_sql_sqlmoveme
  steps:
    - task: UseDotNet@2
      displayName: Use .NET SDK v3.1.x
      inputs:
        packageType: 'sdk'
        version: 3.1.x
        includePreviewVersions: true
    - task: DotNetCoreCLI@2
      displayName: dotnet build
      inputs:
        command: build
        projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
        arguments: --configuration Release /p:NetCoreBuild=true
    - task: PublishPipelineArtifact@1
      displayName: 'Publish Pipeline Artifact sqlmoveme_dev_Release'
      inputs:
        targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/Release
        artifact: sqlmoveme_dev_Release
        properties: ''
```

Okay, that's not so bad. These steps are all in the same job because one cannot run without the other; you cannot publish a file that was never built. Let's look at each task.

UseDotNet@2

This is another item that is technically optional, but it lets you control which version of the .NET Core SDK is used to build the project. If nothing is specified, the build uses whatever version is installed on the build agent, so pinning the version here gives you more control.

DotNetCoreCLI@2

This task runs the dotnet command on the build agent. It is the important one: it takes the .sqlproj and builds it into a deployable .dacpac, so we need to tell it a few things:

- command: build (the dotnet command we want to execute)
- projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj (the location of the .sqlproj to build)
- arguments: --configuration Release /p:NetCoreBuild=true (additional arguments: build the Release configuration and set the NetCoreBuild property so the project builds under .NET Core. This property is now effectively the default, though it may still be marked as required in older documentation. You can also pass --output to control where the built artifacts land; here we rely on the default bin/Release path.)

If you're wondering what $(Build.SourcesDirectory) is, it's a built-in Azure DevOps variable, defined in the Microsoft documentation as the "local path on the agent where your source code files are downloaded." In other words, the build agent downloads the repository to its local disk, and this variable points at that directory.

PublishPipelineArtifact@1

The final step of this job takes the .dacpac produced by the build and publishes it as a pipeline artifact. You provide the build output location, relative to $(Build.SourcesDirectory), as the targetPath to publish. Note that the artifact name includes the project, environment, and release configuration names. This helps with future expansion, since it leaves room for scenarios where you need to run multiple builds for multiple environments with different configurations.
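Because the artifact name encodes project, environment, and configuration, a natural next step is to parameterize this job as a YAML template so the same build logic can be reused across database projects, as mentioned in the introduction. Here is a minimal sketch; the template file name and parameter names (projectName, environment, buildConfiguration) are hypothetical, and the paths assume the src/<project> layout used above.

```yaml
# build-sqlproj-template.yml - a hypothetical reusable job template (sketch, not the article's repo code)
parameters:
  - name: projectName
    type: string
  - name: environment
    type: string
    default: dev
  - name: buildConfiguration
    type: string
    default: Release

jobs:
  - job: build_publish_sql_${{ parameters.projectName }}
    steps:
      - task: UseDotNet@2
        displayName: Use .NET SDK v3.1.x
        inputs:
          packageType: 'sdk'
          version: 3.1.x
      - task: DotNetCoreCLI@2
        displayName: dotnet build
        inputs:
          command: build
          projects: $(Build.SourcesDirectory)/src/${{ parameters.projectName }}/*.sqlproj
          arguments: --configuration ${{ parameters.buildConfiguration }} /p:NetCoreBuild=true
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact ${{ parameters.projectName }}
        inputs:
          targetPath: $(Build.SourcesDirectory)/src/${{ parameters.projectName }}/bin/${{ parameters.buildConfiguration }}
          artifact: ${{ parameters.projectName }}_${{ parameters.environment }}_${{ parameters.buildConfiguration }}
```

A calling pipeline would then reference it under jobs: with `- template: build-sqlproj-template.yml` and pass a parameters: block per database project.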
Final Result

The pipeline now publishes two artifacts: security and sqlmoveme_dev_Release. Here is the full YAML pipeline definition:

```yaml
trigger: none

pool:
  vmImage: 'windows-latest'

stages:
  - stage: bicepaadentra_build
    jobs:
      - job: Publish_security
        steps:
          - task: PublishPipelineArtifact@1
            displayName: 'Publish Pipeline Artifact security'
            inputs:
              targetPath: security
              artifact: security
              properties: ''
      - job: build_publish_sql_sqlmoveme
        steps:
          - task: UseDotNet@2
            displayName: Use .NET SDK v3.1.x
            inputs:
              packageType: 'sdk'
              version: 3.1.x
              includePreviewVersions: true
          - task: DotNetCoreCLI@2
            displayName: dotnet build
            inputs:
              command: build
              projects: $(Build.SourcesDirectory)/src/sqlmoveme/*.sqlproj
              arguments: --configuration Release /p:NetCoreBuild=true
          - task: PublishPipelineArtifact@1
            displayName: 'Publish Pipeline Artifact sqlmoveme_dev_Release'
            inputs:
              targetPath: $(Build.SourcesDirectory)/src/sqlmoveme/bin/Release
              artifact: sqlmoveme_dev_Release
              properties: ''
```

All of the source code for this can be found here: Public Repository

Next Steps

Now that we've covered how to efficiently build a .sqlproj into a deployable .dacpac, the next step is to deploy that .dacpac to SQL Server! Subscribe to the SQL database series, or if you like my posts, feel free to follow me.
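As a closing preview of that next post, here is a minimal sketch of what a deployment stage could look like, using the built-in DownloadPipelineArtifact@2 and SqlAzureDacpacDeployment@1 tasks. The stage and job names, service connection, server, and database names are all hypothetical; only the artifact name comes from the build above.

```yaml
# Hypothetical deployment stage - a sketch, not the article's actual deployment code.
- stage: deploy_sql_dev
  dependsOn: bicepaadentra_build
  jobs:
    - job: deploy_sqlmoveme
      steps:
        - task: DownloadPipelineArtifact@2
          displayName: Download Pipeline Artifact sqlmoveme_dev_Release
          inputs:
            artifact: sqlmoveme_dev_Release
            path: $(Pipeline.Workspace)/sqlmoveme_dev_Release
        - task: SqlAzureDacpacDeployment@1
          displayName: Deploy dacpac to Azure SQL
          inputs:
            azureSubscription: 'my-azure-service-connection'   # hypothetical ARM service connection
            AuthenticationType: 'servicePrincipal'
            ServerName: 'my-sql-server.database.windows.net'   # hypothetical server
            DatabaseName: 'sqlmoveme'                          # hypothetical database
            deployType: 'DacpacTask'
            DeploymentAction: 'Publish'
            DacpacFile: '$(Pipeline.Workspace)/sqlmoveme_dev_Release/**/*.dacpac'
```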