How to create a shared access signature for a specific folder in Azure Storage

As we move into a serverless world, where development work is done less on generic multipurpose servers that require patching both to the OS and to the applications installed on them, things become easier and in some cases cheaper.

For instance, I am currently working on a project that needs to send and receive XML files from different sources, and with no other requirements for a server, Azure Storage seemed the perfect fit. As it's just a storage service you pay for what you need and nothing more. All the patching and maintenance of APIs is provided by Microsoft, and we have an instant place to store our files. Ideal!

However, new technology always comes with new challenges. My first issue came with the plan for how we would be getting files in and out with partners. This isn't the first time I've worked with Azure File Storage, but it is the first time I've done it without an ops person setting things up. Previously we used FTP, which while dated works with a lot of applications. Like a headphone jack it's not the most glamorous of connections, but it works, everyone knows how it works and everyone has things that work with it. However, it transpires that despite Azure Storage being 10 years old and receiving requests for FTP support from the start, Microsoft has decided to go the same route as Apple with the headphone jack and not have it. Instead, the only option for an integration is REST. As it turns out, the ops people I had worked with in the past, when faced with this issue, had just put a VM in front of it, which rather defeats the point of using Azure storage in the first place!

So we're going with REST, and Microsoft provides quite a straightforward REST API. All good so far, but how do we limit access? Well, there's a guide to Using the Azure Storage REST API which contains a section on creating an authorisation header. It's long and overly complex, but it does point you in the direction of needing a Shared Access Signature. The other option is an access key, but that is something you should never give away to a third party.

Shared Access Signature

After a bit more digging through the documentation (and just clicking the thing that sounded right in the portal) I found this documentation on creating an Account SAS, which sounded like what I wanted (it wasn't, but it's close).

With a shared access signature you can say what kind of service should be allowed, what permissions it should have, allowed IP addresses, and start and end dates. All awesome things.
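As a rough sketch of how one of these might be generated outside the portal, the Az PowerShell module can create an account SAS along these lines (the account name and key below are placeholders, not values from this post):

# Sketch only: generating an account-level SAS with the Az.Storage module.
# The account name and key are placeholder values.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

# Pick the services, resource types, permissions and expiry the partner needs
New-AzStorageAccountSASToken -Service Blob,File -ResourceType Service,Container,Object `
    -Permission "rwl" -ExpiryTime (Get-Date).AddDays(30) -Context $ctx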

Once I had this I could use the REST API, but there was a problem. I could access every folder in the storage account and there was no way to stop this! For an integration with two third parties, they would both be able to access each other's files, as well as our own private ones.

There is also no way to revoke a SAS once it's been generated, other than regenerating the access keys, which would affect everyone.

Folder Level Shared Access Signature

After a bit more research I found what I was looking for: how to create a shared access signature at a folder or item level, and how to link it to a policy.

The first thing you need is Azure Storage Explorer. Once you're set up with this you will be able to view all your storage accounts.

From here you can browse to the folder you want to share, right click it, and choose Manage Access Policies.

This will open a dialog to manage the policies for this specific object rather than the account.

Here you can set all the same permissions as you could for a signature at an account level, but now for a specific object and against a policy rather than an actual signature, meaning the policy can be updated in the future with no change to the signature.

Better still, you can remove the policy, which will then invalidate any signature using it.

For the actual signature key, right click the same folder and click Get Shared Access Signature.

Then in the dialog, select the policy from the dropdown rather than specifying the individual permissions.

Click create and you can copy the keys.

You now have an access key that is limited to a specific folder rather than the entire account.
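To check the signature works, you can append it to the URL of an item in that folder and call the REST API directly. Here's a minimal, hedged sketch against blob storage, with placeholder account, folder and file names:

# Sketch: upload an XML file using only the folder-level SAS (placeholder names).
$sas = "?sv=2018-03-28&sr=..."   # paste the SAS token copied from Storage Explorer
$url = "https://mystorageaccount.blob.core.windows.net/partner-a/invoice.xml$sas"

# The Put Blob REST operation requires the x-ms-blob-type header
Invoke-RestMethod -Method Put -Uri $url -InFile ".\invoice.xml" `
    -ContentType "application/xml" -Headers @{ "x-ms-blob-type" = "BlockBlob" }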

Creating a folder-level signature like this is only possible through one of the code/scripting interfaces, e.g. PowerShell, or through Storage Explorer. The Azure portal will only let you generate signatures at an account level.
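For reference, a rough PowerShell equivalent of the Storage Explorer steps above might look like this (sketched against a blob container with placeholder names; the file share cmdlets follow the same pattern):

# Sketch: create a stored access policy, then a SAS tied to it (placeholder names).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"

New-AzStorageContainerStoredAccessPolicy -Container "partner-a" -Policy "partner-a-policy" `
    -Permission rwl -ExpiryTime (Get-Date).AddMonths(6) -Context $ctx

# Generate the signature from the policy rather than individual permissions
New-AzStorageContainerSASToken -Name "partner-a" -Policy "partner-a-policy" -Context $ctx

# Revoking access later mirrors the "remove the policy" step above
Remove-AzStorageContainerStoredAccessPolicy -Container "partner-a" -Policy "partner-a-policy" -Context $ctx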

Setting up local https with IIS in 10 minutes

For very good reasons, websites now nearly always run under https rather than http. As devs, though, this gives us a complication: either remove any local redirect-to-https rules and "hope" things work ok when we get to a server, or set local IIS up with an https binding.

Having https set up locally is obviously a lot more favourable, and what has traditionally been done is to create a self-signed certificate. However, while this works as far as IIS is concerned, it still leaves an annoying browser warning, as the browser will recognise it as insecure. This can then create additional problems in client-side code when certain things hit an error while calling an API.

mkcert

The solution is to have a certificate added to your trusted root certificates rather than a self-signed one. Fortunately there is a tool called mkcert that makes the process a lot simpler.

https://github.com/FiloSottile/mkcert#windows

Create a local cert step by step

1. If you haven't already, install Chocolatey ( https://chocolatey.org/install ). Chocolatey is a package manager for Windows which makes it super simple to install applications. The name is inspired by NuGet, i.e. Chocolatey NuGet.

2. Install mkcert. To do this, from an admin command window run

choco install mkcert

3. Create a local certificate authority (CA)

mkcert -install

4. Create a certificate

mkcert -pkcs12 example.com

Remember to change example.com to the domain you would like to create a certificate for.

5. The certificate will be created in the folder your command window is open at. Rename the .p12 file that was created to .pfx (this is the extension IIS requires).

You can now import the certificate into IIS as normal. When asked for a password, it has been set to changeit.
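If you'd rather script the import than click through IIS Manager, something along these lines should work (a sketch; the site name and domain are placeholders):

# Sketch: import the pfx into the machine store, then wire up an https binding.
$pfxPassword = ConvertTo-SecureString "changeit" -AsPlainText -Force
$cert = Import-PfxCertificate -FilePath ".\example.com.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword

# The WebAdministration module ships with IIS
Import-Module WebAdministration
New-WebBinding -Name "Default Web Site" -Protocol https -Port 443 -HostHeader "example.com"
New-Item -Path "IIS:\SslBindings\0.0.0.0!443" -Value $cert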

ASP.NET Core Platforms for a Blog

Like a lot of Sitecore developers, my blog (at the time of writing) is hosted on WordPress. The reason for it not being in Sitecore is simple: Sitecore is an enterprise-level platform, which isn't really needed for a personal blog.

For a .NET dev to have their blog on a PHP platform just seems plain wrong, but again there's a logical reason. WordPress is actually really good as a blogging platform, and it doesn't cost me anything.

Despite this, I would much rather take control of my site and use it to play with all the cool features in Azure. It would also be nice to be able to do something about the Google PageSpeed result, which is currently sitting at 24%. So in aid of this, I've started looking into .NET Core based platforms and thought I'd share what I've found.

Miniblog.Core

https://github.com/madskristensen/Miniblog.Core

As the name suggests, Miniblog.Core is both very small and based on .NET Core. Developed by Mads Kristensen, it's an extremely lightweight, bare-bones implementation, which is ideal if you're after something you can build upon. The code is straightforward to understand and very simple to adapt. Additionally, if you're after a 100% PageSpeed score, then this achieves just that.

If, on the other hand, you're after a deluxe admin experience full of functionality, then this probably isn't for you.

Piranha CMS

http://piranhacms.org/

Piranha CMS is built as a lightweight CMS platform rather than specifically as a blog; however, it also contains a blog module, which for me puts it at a big advantage over the other CMS platforms listed below.

On the back end you get a choice of SQL Server, SQLite or MySQL. The documentation isn't exactly complete, but on the day I tried it out, I found the team building it very responsive on GitHub. They even updated the documentation with one of my suggestions the very next day.

Another aspect I particularly liked about Piranha CMS was its block editor, which from the brief look I've had so far reminds me of the block editor Umbraco has, whereas the other platforms in this list are restricted to a large rich text field.

Orchard Core

https://github.com/OrchardCMS/OrchardCore

Orchard Core is the .NET Core version of the Orchard CMS. It's currently in beta, but I'm not sure that puts it at much of a disadvantage over the others on this list.

My initial impressions of Orchard Core, however, weren't as high as of Piranha CMS. The admin interface wasn't quite as nice and, as far as I could tell, it didn't have anything like Piranha's block editor. The solution itself also seemed far more complex, and I wasn't certain what I got in return. I expect Orchard Core is likely better in some ways that I have yet to discover, but for my needs as a blog this is probably not the case. It also didn't have a blog module out of the box.

Squidex

https://squidex.io/

I haven't had much of a chance to play with Squidex yet, but it does offer an interesting difference to the others mentioned so far.

For a start, Squidex is an entirely headless CMS, built around the concepts of CQRS and Event Sourcing. Unlike the others, it also uses MongoDB rather than a SQL-based database.

Where MongoDB is concerned, I often get the impression people use it because as developers we tend to prefer something new over something adequate. However, when it comes to Azure pricing, there is potentially a saving to be made by using Mongo rather than Azure SQL.

Redirecting to login page with AngularJs and .net WebAPI

So here's the scenario: you have a web application which people log into, and some of the pages (like a dashboard) contain Ajax functionality. Inevitably the user's session expires, they return to the window, change a filter, and nothing happens. In the background, your JavaScript is making http calls to the server which trigger an unauthorised response. The front end has no way to handle this and some errors appear in the JS console.

A few things are actually combining to make life hard for you here. Let's take a look at each in more detail.

WebAPI and the 302 Response

To protect your APIs from public access, a good solution is to use the Authorize attribute, i.e.

[Authorize]
public ActionResult GetDashboardData(int foo)
{
    // Your api logic here
}

However, chances are your solution also has a login page configured in your web.config, so that your regular page controllers automatically trigger a 302 response to the login page.

<authentication mode="Forms">
  <forms timeout="30" loginUrl="/account/sign-in/" />
</authentication>

So what now happens is that instead of responding with a 401 Unauthorised, what's actually returned is a 302 redirect to the login page.

With an Ajax request from a browser you now hit a second issue. The browser is making an XMLHttpRequest, but if that request returns a redirect, rather than handing it back to your JavaScript code to handle, the browser "helpfully" follows it and returns the final response to your JavaScript. Which means rather than receiving a redirect status back, your code is getting a 200 Ok.

So to summarise: your API was set up to return a 401 Unauthorised, that got turned into a 302 Redirect, which was then followed and turned into a 200 Ok before it got back to where it was requested from.

The easiest fix is to create our own version of the AuthorizeAttribute which returns a 403 Forbidden for Ajax requests and the regular logic for anything else.

using System;
using System.Web.Mvc;

namespace FooApp
{
    [AttributeUsage(AttributeTargets.Method)]
    public class CustomAuthorizeAttribute : AuthorizeAttribute
    {
        protected override void HandleUnauthorizedRequest(AuthorizationContext filterContext)
        {
            if (filterContext.HttpContext.Request.IsAjaxRequest())
            {
                filterContext.Result = new HttpStatusCodeResult(403, "Forbidden");
            }
            else
            {
                base.HandleUnauthorizedRequest(filterContext);
            }
        }
    }
}

Now for any Ajax requests a 403 is returned, while for everything else the 302 redirect to the login page is returned.

Redirect 403 Responses in AngularJs to the login page

As our Ajax request is now being informed about the unauthorised response, it's up to our JavaScript code to trigger the redirect in the browser to the login page. What would be really helpful would be to define the redirect logic in one place, rather than adding it to every API call in our code.

To do this we can add an interceptor onto the http provider in AngularJS. The interceptor will inspect the response error coming back from the XMLHttpRequest and, if it has a status of 403, use window.location to redirect the user to the login page.

app.factory('httpForbiddenInterceptor', ['$q', 'loginUrl', function ($q, loginUrl) {
    return {
        'responseError': function (rejection) {
            if (rejection.status === 403) {
                window.location = loginUrl;
            }
            return $q.reject(rejection);
        }
    };
}]);

app.config(['$httpProvider', function ($httpProvider) {
    $httpProvider.defaults.headers.common['X-Requested-With'] = 'XMLHttpRequest';
    $httpProvider.interceptors.push('httpForbiddenInterceptor');
}]);

You'll notice a line updating the headers. This is to make the IsAjaxRequest() method on the API recognise the request as being Ajax.

Finally, you'll also notice the loginUrl being passed into the interceptor. As it's not a great idea to have strings like URLs littered around your code, this uses a value recipe to store the URL. The code to do this is as follows:

app.value('loginUrl', '/account/sign-in?returnurl=/dashboard/');