Deploying a SQL DB with Azure Pipelines

Normally when I work with SQL Azure I handle DB schema changes with Entity Framework migrations. However, if you're using Azure Functions rather than WebJobs, there seem to be a number of issues with this, and I could not find a decent guide that resulted in a working solution.

Migrations aren't the only way to release a DB change though. SQL Server Database projects have existed for a long time and are a perfectly good way of automating a DB change. My preference for EF Migrations really comes from not wanting to maintain an EF model and a separate table schema when they're essentially duplicates of each other.

Trying to find out how to deploy this through Azure DevOps Pipelines, however, was far harder than I expected (my expectation was about 5 mins). A lot of guides weren't very good, and virtually all of them start with "Click new pipeline, then select Use the classic editor". WAIT. The Classic Editor, in an article written 3 months ago!?!?! Excuse me while I search for a solution slightly more up to date.

Creating a dacpac file

At a high level, the solution is to have a SQL Server Database project, use an Azure Pipeline to compile it into a dacpac file, and then use a release pipeline to deploy that to the SQL Azure DB.

I'm not going to go into any detail about how you create a SQL Server Database project; it's relatively straightforward. The one thing to be aware of is that the project needs a target platform of Microsoft Azure SQL Database, otherwise you'll get a compatibility error when you try to deploy.
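If you want to check the setting outside of Visual Studio, the target platform is stored in the .sqlproj file as the DSP element. For an Azure SQL Database project it should look something like this (the exact provider name may vary with the version you target):

<DSP>Microsoft.Data.Tools.Schema.Sql.SqlAzureV12DatabaseSchemaProvider</DSP>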

Building a SQL Server Database project in Azure Devops

To build a dacpac file, create a new pipeline in Azure DevOps (the YAML kind), select your repo and get yourself a blank configuration file. Also, at this point make sure your code is actually in the repo!

The configuration I used looks like this; I've included notes in the code to explain what's going on.

# The branch you want to trigger a build
trigger:
- master

pool:
  vmImage: "windows-latest"

variables:
  configuration: release
  platform: "any cpu"
  solutionPath: # Add the path to your Visual Studio solution file here

steps:
# Doing a Visual Studio build of your solution will trigger the dacpac file to be created.
# If you have more projects in your solution (which you probably will) you may get an error here,
# as we haven't restored any NuGet packages etc. For just a SQL DB project, this should work.
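# If your solution does contain other projects, adding a NuGet restore step before
# the build usually fixes that error. A minimal sketch (uncomment to use):
# - task: NuGetCommand@2
#   displayName: Restore NuGet packages
#   inputs:
#     command: restore
#     restoreSolution: $(solutionPath)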
- task: VSBuild@1
  displayName: Build solution
  inputs:
    solution: $(solutionPath)
    platform: $(platform)
    configuration: $(configuration)
    clean: true

# When the dacpac is built it will be in the project's bin/configuration folder.
# To get it into an artifact (probably with some other things you want to publish, like an Azure Function)
# we need to move it somewhere else. This will move it to a folder called drop.
- task: CopyFiles@2
  displayName: Copy DACPAC
  inputs:
    SourceFolder: "$(Build.SourcesDirectory)/MyProject.Database/bin/$(configuration)"
    Contents: "*.dacpac"
    TargetFolder: "$(Build.ArtifactStagingDirectory)/drop"

# Publishes the contents of the drop folder into an artifact
- task: PublishBuildArtifacts@1
  displayName: "Publish artifact"
  inputs:
    PathtoPublish: "$(Build.ArtifactStagingDirectory)/drop"
    ArtifactName: # Artifact name goes here
    publishLocation: container

Releasing to SQL Azure

Once the pipeline has run you should have an artifact coming out of it that contains the dacpac file.

To deploy the dacpac to SQL Azure you need to create a release pipeline. You can do this within the build pipeline, but personally I think builds and releases are different things and should therefore be kept separate, particularly as releases should be promoted through environments.

Go to the Releases section in Azure DevOps and click New, then New release pipeline.

There is no template for this kind of release, so choose Empty job on the next screen that appears.

On the left you will be able to select the artifact getting built from your pipeline.

Then from the Tasks drop down select Stage 1. Stages can represent the different environments your build will be deployed to, so you may want to rename this something like Dev or Production.

On Agent Job click the plus button to add a task to the agent job. Search for dacpac and click the Add button on Azure SQL Database deployment.

Complete the fields to configure which DB the dacpac will be deployed to, filling in your own server, database and credential details.

And that's it. You can now run the pipelines and your SQL Project will be deployed to SQL Azure.
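As an aside, if you would rather keep everything in YAML instead of using the classic release editor, the same deployment can be done with the SqlAzureDacpacDeployment task. This is only a minimal sketch; the service connection name, variables and dacpac path are placeholders you would need to swap for your own:

- task: SqlAzureDacpacDeployment@1
  displayName: Deploy dacpac to SQL Azure
  inputs:
    azureSubscription: "MyServiceConnection" # placeholder service connection name
    ServerName: "$(SqlServerName)" # e.g. myserver.database.windows.net
    DatabaseName: "$(DatabaseName)"
    SqlUsername: "$(SqlUsername)"
    SqlPassword: "$(SqlPassword)"
    DacpacFile: "$(Pipeline.Workspace)/drop/MyProject.Database.dacpac" # placeholder path within the downloaded artifact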

Some other tips

On the Azure SQL Database deployment task there is a property called Additional SqlPackage.exe Arguments. This can be used to specify things like whether loss of data should be allowed. You can find the list of available properties at https://docs.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage?view=sql-server-ver15#properties
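For example, to let a deployment proceed even when a schema change could lose data (handle with care anywhere other than a dev environment), you could pass something like:

/p:BlockOnPossibleDataLoss=false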

If you are deploying to multiple environments you will want to use variables for the server details rather than having them on the actual task. This will make it easier to clone the stages and have all connection details configured in one place.

Equivalent of the Octopus Package Library for Azure DevOps

I've been using Team City and Octopus Deploy in our CI setups for several years, but over the last 6 months I have slowly been moving over to Azure DevOps. This is mostly to move away from needing to keep an application like Team City updated, by switching to a SaaS service. Octopus is available as a SaaS service too, but as Azure DevOps is also capable of doing releases I opted to start with that rather than using Octopus again.

One of my first challenges was how to replace Octopus's package library. The solutions I'm working on (which are generally Sitecore based) consist of the application we write and store in source control, and all the other pre-built parts of Sitecore which we don't store in source control.

With Octopus Deploy we would add these static parts to the Octopus package library and then release a combination of them and our application to a server, wiping what is there before we do. That then gives us the exact same deploy on every target.

With Azure Devops however, things are a bit different.

Azure Artifacts

The closest equivalent of the package library is Azure Artifacts. However, rather than its primary purpose being to upload packages to be included in a deploy, the goal of Artifacts is more in line with the output of a pipeline being saved to an artifact, which can then act as a package feed for things like NuGet or npm.

Unfortunately, there is also no UI to directly upload a NuGet package to an Artifact feed like there is with Octopus’s package library. However, it is possible to upload a pre-built NuGet package another way using NuGet.exe itself.

Manually uploading a NuGet package to Azure Artifacts

To start, you need to create a feed for your package to be uploaded to.

Log in to Azure DevOps and go to Artifacts. Click Create feed and give it a name.

Click Connect to feed and select NuGet.exe. Under the heading Project setup you will see a nuget.config file to add to your solution.

Create an empty solution in Visual Studio and add a nuget.config file to the root folder using the source from the website. It will look something like this;

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="MyFeed" value="https://pkgs.dev.azure.com/.../_packaging/MyFeed/nuget/v3/index.json" />
  </packageSources>
</configuration>

Copy the NuGet package you would like to publish into the root folder as well. I'm using a package we have created for a tool called Feydra.

Before we can publish into our NuGet feed we need to set up a personal access token. This is done from within Azure DevOps.

In the top right corner click the person icon and then select profile. On the profile screen you will see a section called Personal access tokens under Security.

From here click New Token and give it the Read, write & manage permission for Packaging.

You will be given a token which you should save a copy of.

Now we are ready to publish our NuGet package to the Artifact feed.

Open a command prompt at the folder containing your solution file and run the following command:

nuget push -Source <SourceName> -ApiKey az <PackagePath>

For me this is as follows:

nuget push -Source MyFeed -ApiKey az .\Feydra.Custom.1.0.0.30.nupkg

You will then be prompted for a username and password; use the personal access token for both.
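If you want to avoid the interactive prompt, for example when scripting this, I believe you can store the credentials against the source up front with nuget.exe. A sketch, assuming the MyFeed source from the nuget.config above (the username can be anything when a personal access token is used):

nuget sources update -Name MyFeed -UserName AzureDevOps -Password <YourPersonalAccessToken>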

Refresh the feed on the website and you should see your package added to it, ready to be included in a release pipeline.

Using Sitecore TDS with Azure Pipelines

Sitecore TDS allows developers to serialize Sitecore items into a file format which enables them to be stored in source control. These items can then be turned into a Sitecore update package to be deployed into a Sitecore solution.

With tools like Team City it was possible to install the TDS application on the build server, and MSBuild would pick it up in the same way that Visual Studio does when you create a build locally. However, with build tools like Azure Pipelines that are a SaaS service, you do not have any access to install components on the server.

If your solution contains a TDS project and you run an Azure Pipeline, you will probably see an error saying that the target 'Build' does not exist in the project.

Fortunately, TDS can be added to a project as a NuGet package. The documentation I found on Hedgehog's site says the package is unlisted on NuGet.org but can be installed in the normal way if you know what it's called. When I tried this it didn't work, but the NuGet package is also included in the TDS download, so we can do it a different way.

First, download TDS from Hedgehog's website: https://www.teamdevelopmentforsitecore.com/Download/TDS-Classic

Next, create a folder in the root of your drive called LocalNuGet and copy the NuGet package from the download into this folder.

Within Visual Studio go to Tools > NuGet Package Manager > Package Manager Settings, and in the window that opens select Package Sources on the left.

Add a new package source and set it to your LocalNuGet folder.
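If you prefer the command line, the same source can be registered with nuget.exe instead of through the Visual Studio UI; a sketch, assuming the folder created above:

nuget sources add -Name LocalNuGet -Source c:\LocalNuGet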

From the Package Manager screen, select your local NuGet source as the package source and then add the TDS package to the relevant projects.

When you commit to source control, make sure you add the Hedgehog package in your project's packages folder. Normally this will be excluded, as your build will try to restore the NuGet packages from a NuGet feed rather than having them in the repo.
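If you're using Git with the standard Visual Studio .gitignore, the packages folder is excluded with a rule like **/packages/*, so you'll need a negation rule to get the TDS package committed. A sketch, assuming the package version shown in the csproj snippet below:

# already present in the standard Visual Studio .gitignore
**/packages/*
# re-include just the TDS package so it ends up in the repo
!**/packages/HedgehogDevelopment.TDS.6.0.0.10/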

If you run your Azure Pipeline now, you should get an error from TDS complaining about the product key.

This is a great first step, as we can now see TDS is being used and we're getting an error from it.

If you're not getting this, then open the csproj file for your project and check that there are no references to c:\program files (x86)\Hedgehog Development\. Instead you should have something like this:

<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
  <PropertyGroup>
    <ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.  For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
  </PropertyGroup>
  <Error Condition="!Exists('..\packages\HedgehogDevelopment.TDS.6.0.0.10\build\HedgehogDevelopment.TDS.targets')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\HedgehogDevelopment.TDS.6.0.0.10\build\HedgehogDevelopment.TDS.targets'))" />
</Target>
<Import Project="..\packages\HedgehogDevelopment.TDS.6.0.0.10\build\HedgehogDevelopment.TDS.targets" Condition="Exists('..\packages\HedgehogDevelopment.TDS.6.0.0.10\build\HedgehogDevelopment.TDS.targets')" />

To fix the product key error we need to add our license details.

Edit your pipeline and then click on the variables button in the top right. Add two new variables:

  • TDS_Key - Which should be set to your license key
  • TDS_Owner - Which should be set to the company name the key is linked to
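If your build is the YAML kind, these can also be declared in the pipeline definition itself, although the key is better kept as a secret variable in the UI so it never ends up in source control. A minimal sketch with placeholder values:

variables:
  TDS_Owner: "Your Company Name" # placeholder - the owner name your license is registered to
  # TDS_Key is best added as a secret pipeline variable in the UI rather than committed here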

Now run your pipeline again and the build should succeed.