SonarQube for .NET Framework with GitHub Actions

If you haven't tried SonarQube or SonarCloud out then I suggest you do. The cloud version is quite straightforward to set up and, from my experience, the issues it finds can be quite insightful. Like all these tools, at times you'll disagree with what they say, but there's always the option to change the rules.

What I particularly like with SonarQube is the examples you get with each bug that clearly explain why there's an issue and what you need to do in order to fix it.

What I didn't like, however, were the instructions for setting up a project using .NET Framework. There are instructions labelled .NET, but these heavily assume you're using .NET Core, which, while that might be our general preference, isn't always an option: products like Sitecore can force your hand back to .NET Framework, and all those legacy projects didn't just go away.

How to set up SonarQube using GitHub Actions for .NET Framework

The GitHub setup instructions (https://docs.sonarqube.org/latest/analysis/github-integration/) will give you the following code to create your GitHub Action with. This is also the same code you will get if you follow the wizard in SonarQube.

name: Build
on:
  push:
    branches:
      - master # or the name of your main branch
  pull_request:
    types: [opened, synchronize, reopened]
jobs:
  build:
    name: Build
    runs-on: windows-latest
    steps:
      - name: Set up JDK 11
        uses: actions/setup-java@v1
        with:
          java-version: 1.11
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0 # Shallow clones should be disabled for a better relevancy of analysis
      - name: Cache SonarQube packages
        uses: actions/cache@v1
        with:
          path: ~\sonar\cache
          key: ${{ runner.os }}-sonar
          restore-keys: ${{ runner.os }}-sonar
      - name: Cache SonarQube scanner
        id: cache-sonar-scanner
        uses: actions/cache@v1
        with:
          path: .\.sonar\scanner
          key: ${{ runner.os }}-sonar-scanner
          restore-keys: ${{ runner.os }}-sonar-scanner
      - name: Install SonarQube scanner
        if: steps.cache-sonar-scanner.outputs.cache-hit != 'true'
        shell: powershell
        run: |
          New-Item -Path .\.sonar\scanner -ItemType Directory
          dotnet tool update dotnet-sonarscanner --tool-path .\.sonar\scanner
      - name: Build and analyze
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
        shell: powershell
        run: |
          .\.sonar\scanner\dotnet-sonarscanner begin /k:"example" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="${{ secrets.SONAR_HOST_URL }}"
          dotnet build
          .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"

There are two things to notice with this. Firstly, the Build and analyze step runs the command dotnet build, which is fine if you're running .NET Core, but for .NET Framework it isn't going to work.

Secondly, it's highly likely your solution uses NuGet packages, and there's no step in here to restore them.

To set up NuGet and restore your packages, add the following steps before the Build and analyze step. Be sure to put your solution filename in the restore command.

      - name: Setup Nuget
        uses: Nuget/setup-nuget@v1.0.5

      - name: Restore nuget packages
        run: nuget restore MySolution.sln

To do a build that will compile your .NET Framework code you will need to use MSBuild rather than dotnet. However, if you just swap them over you'll get an invalid command error. First you need to add MSBuild to the PATH. Change your build steps as follows.

      - name: Add msbuild to PATH
        uses: microsoft/setup-msbuild@v1.0.2

      - name: Build and analyze
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information, if any
        shell: powershell
        run: |
          .\.sonar\scanner\dotnet-sonarscanner begin /k:"example" /d:sonar.login="${{ secrets.SONAR_TOKEN }}" /d:sonar.host.url="${{ secrets.SONAR_HOST_URL }}"
          msbuild MySolution.sln # MSBuild replaces dotnet build - use your own solution filename
          .\.sonar\scanner\dotnet-sonarscanner end /d:sonar.login="${{ secrets.SONAR_TOKEN }}"

With that in place you can now compile your .NET Framework code and have the results sent back to your SonarQube instance.

Exporting a Database from SQL Azure

If you need to copy your SQL Azure database locally then exporting it as a Bacpac file is the simplest route to go.

There are many ways you can do this, including PowerShell, the REST API, and even SQL Server Management Studio. However, I am going to show you how to do it through the Azure Portal.

Exporting a Bacpac file

First off find your DB in the Azure Portal.

In the top set of buttons when you view your DB (make sure you're on the actual DB, not the server), you will see an Export button third from the left.

Clicking Export takes you to a page where you need to configure the filename to export as, the storage account to export to, and the authentication to log into the DB. The storage account is required as a place for the export to be saved; you get the choice of which container it goes into, so I like to create one called backups.
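If you'd rather script the export than click through the portal, the Az PowerShell module can do the same job. This is only a rough sketch; the resource group, server, database and storage account names are placeholders for your own values, and it assumes you've already signed in with Connect-AzAccount.

# Grab a key for the storage account the bacpac will be written to
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName "my-rg" -Name "mystorageaccount")[0].Value

# Start the export - this runs asynchronously in Azure
$exportRequest = New-AzSqlDatabaseExport `
    -ResourceGroupName "my-rg" `
    -ServerName "my-sql-server" `
    -DatabaseName "MyDatabase" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $storageKey `
    -StorageUri "https://mystorageaccount.blob.core.windows.net/backups/MyDatabase.bacpac" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (Read-Host "SQL admin password" -AsSecureString)

# Keep hold of this - it's what you use to check progress later
$exportRequest.OperationStatusLink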

Checking the status of the export

When you start the export you will get a confirmation that it has started, but how to check its progress isn't obvious.

To check the progress, navigate to the DB server in the portal.

In the left nav, under Data management, select Import/Export history.

From here you can view the status of each of the exports.

Once they are complete you can download them through blob storage.
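If you kicked the export off from PowerShell rather than the portal, you can check the same progress with the operation status link returned by New-AzSqlDatabaseExport. A minimal sketch, reusing the $exportRequest variable from the earlier example:

$status = Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
$status.Status        # e.g. InProgress or Succeeded
$status.StatusMessage # typically includes a percentage while the export is running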

Managing SQL Azure Users in the Portal

Managing users for a SQL Azure DB is something I have found to be more complex than you would expect. A lot of guides will also tell you it's something which can't be done through the admin portal and needs to be done using scripts in the DB.

This is true to some extent. If you want to set specific role permissions on a DB then you have to do it by assigning roles through SQL scripts. Similarly, if you want to set usernames and passwords at a DB level rather than using Active Directory, this also needs to be done in the DB.

However, if you want to give a bunch of Active Directory users admin access to all the DBs in a server, or give a group of people the same access, then this can be done through the Azure Portal.

Admin Permissions For All

When you create your DB instance an admin user will get created, and for some teams you could just share the password. However, sharing passwords isn't great, and there is a better way.

In the Azure Portal search for groups in the big search box at the top.

Create a security group with a sensible name and description, and add all the members you want to give admin permission to.

Go to your SQL Server resource (this is the parent of the database), and go to the Azure Active Directory setting.

Click the top button to Set Admin, choose your new group and then click save. This will create the user with the correct permissions in the master DB of the server.
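If you prefer scripting this step, the Az PowerShell module can set the same Active Directory admin. Again just a sketch, with placeholder resource and group names:

# Sets the security group created above as the Azure AD admin for the server
Set-AzSqlServerActiveDirectoryAdministrator `
    -ResourceGroupName "my-rg" `
    -ServerName "my-sql-server" `
    -DisplayName "SQL Admins"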

That's it, the members of the group will now be able to access any of the DBs on the server by logging in using Active Directory with Password through SSMS, or through the Azure Portal using the Query Editor.

Query editor will actually give you a nice green tick if you have permission to log in.

To add or remove people's access to the DB, just add or remove them from the group.

If you can't log in, it could be due to a firewall restriction on your IP rather than an actual login permission.
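If it is the firewall, a rule for your IP can be added through the portal, or scripted with the Az module along these lines (placeholder names and IP):

# Allows a single IP address through the server-level firewall
New-AzSqlServerFirewallRule `
    -ResourceGroupName "my-rg" `
    -ServerName "my-sql-server" `
    -FirewallRuleName "AllowMyIp" `
    -StartIpAddress "203.0.113.42" `
    -EndIpAddress "203.0.113.42"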

Permissions to Specific DBs

Giving everyone admin permission to every DB on the instance might not be what you're after. Fine for a dev instance, but probably not something you want for production.

Fortunately, the same concept of using groups can make life a lot easier, but you will need to do some SQL scripting.

Create your group as above, and then make sure you're logged in as someone who is an Active Directory admin for the SQL Server. You can do this with the instructions above, or if you want to be the only admin, then rather than setting a group as the admin just set yourself.

Next log into the DB either using SSMS or Query Editor. Personally I prefer to use Query Editor as I'm doing everything else through the portal.

Our first script is to create an external user in our DB. In our case the external user is the group we want to give permission to rather than a specific user.

CREATE USER [GROUP NAME]
FROM EXTERNAL PROVIDER
WITH DEFAULT_SCHEMA = dbo;

This is called adding a contained user to the DB.

Next we need to give the group some role permissions to do something.

ALTER ROLE db_datareader ADD MEMBER [GROUP NAME];
ALTER ROLE db_datawriter ADD MEMBER [GROUP NAME];

Repeat these steps for each DB you want to give the group access to.

The members of your new group should now have reader and writer permissions on the individual DBs.

If you want to give access to more people, just add them to the group.