Why is my Session ID changing on 3D Secure payments?

If you have a website where you are implementing 3D Secure payments, you may find that on receipt of the payment setup confirmation the user's Session ID has changed with no apparent cause.

Let's have a quick run through of the payment process in this scenario (this is roughly how SagePay and WorldPay both work):

  1. User completes payment details on your site (some wizardry normally happens at this point with an iFrame for the card number to maintain PCI compliance) and the form is submitted to your server
  2. Your server calls an API from the payment gateway to set up a 3D Secure payment. It passes the user's Session ID along with all the payment details.
  3. Payment gateway responds with a URL for you to redirect the user to by posting a form to it. You do this, likely in an iFrame so that the page still looks like your website (otherwise it's a very ugly page).
  4. User may or may not get prompted by some sort of authentication by the bank. This could be something like receiving a text message with a code to enter.
  5. When authenticated, the user is sent back to your website (in the iFrame) with a POST request.
  6. Your server picks out the form details from the request and then calls an API to complete the transaction. In this API call you send the Session ID so that the payment gateway can validate it is the same as the one at the start of the process.
  7. Confirmation shown to the user.

Introducing the Same Site Cookie Policy

In 2020 a change was made to how cookies function in browsers to defend against cross-site request forgery. Troy Hunt has a brilliant explanation of the issue with how cookies used to work and how this has changed here. I'm going to try a much shorter explanation:

When a request is made from a browser, all the cookie values for that domain are sent with the request, including the one for the Session ID. The theory here is that because the cookies are only being sent to the domain which set them in the first place, information is only being shared back with the place that set it to begin with, which is therefore safe.

However, the workings of the internet and what domain a button click might call aren't overly obvious to most people. So what if clicking a link on one site causes the user to be redirected to another site? Answer: all the cookies are still sent. The same thing happens if a form is posted from one site to another site. The problem here is that if you are authenticated on the other website, then it's possible for a cross-site request forgery attack to use your session via your browser without you even realising you were on the site again. This is why it's always a good idea to log out of websites!

The introduction of the same site policy changes how cookies work with three options:

  1. None: which is what browsers used to do, i.e. send all the cookies with cross-origin requests
  2. Lax: some limits on sending cookies with cross-origin requests
  3. Strict: tight limits on sending cookies with cross-origin requests

None sends everything, and a Strict policy will basically stop cookies from being sent on cross-origin requests.

A Lax policy is slightly more interesting though, because it depends on whether the request is a GET or a POST. If it is a GET then the cookies will still be sent, which means that if you follow a link from another site or a search engine your cookies will still be sent to the site. However, a POST (like the one the 3D Secure page is making) will no longer send the cookies.

If the policy isn’t set, then Lax is used as the default.

So why is my Session ID changing?

The problem is that POST request back to your site: unless the Session ID cookie had a policy of None, it won't be sent. The server will then see that there is no Session ID, treat the user as new to the site and, as a result, start a new session. From this point on the user has lost the old session and you can't complete the payment. Worse still, they've probably just been logged out and anything else using session data has also been lost.

In .NET 4.7.2 and up, Microsoft has implemented the ability to set the cookie policy of the Session ID. You can do this in your web.config file like this:

<configuration>
 <system.web>
  <anonymousIdentification cookieRequireSSL="false" /> <!-- No config attribute for SameSite -->
  <authentication>
   <forms cookieSameSite="None" requireSSL="false" />
  </authentication>
  <sessionState cookieSameSite="None" /> <!-- No config attribute for Secure -->
  <roleManager cookieRequireSSL="false" /> <!-- No config attribute for SameSite -->
 </system.web>
</configuration>

You can find more about this here. In anything older the policy won't be set and browsers will default to Lax.
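If you need to control it on an individual cookie in code rather than in config, .NET 4.7.2 also added a SameSite property to HttpCookie. A minimal sketch (the cookie name and value are just placeholders):

var cookie = new HttpCookie("MyCookie", "some value")
{
    // Emit SameSite=None so the cookie is allowed on cross-origin requests
    // (exact behaviour depends on having up-to-date .NET Framework patches)
    SameSite = SameSiteMode.None,
    Secure = true // browsers require Secure when SameSite=None
};
Response.Cookies.Add(cookie);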

However just because you can set the cookie policy to None, it doesn’t mean you should. After all, that just re-opens the vulnerability the browser was trying to protect against.

The solution I went with was to have a page that the user is redirected back to from the payment gateway which does nothing other than re-submit the values from the payment gateway to the site again. This way I can be sure of what form data is being posted to the site, and when I re-post it, it is a same-site POST and not a cross-origin one, which means the Session ID cookie will be sent.
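A minimal sketch of that intermediate page's code behind (the target URL and the placeholder control name are assumptions, not something from the gateway; requires System.Text and System.Web):

// The page the gateway posts back to. It re-posts the received values to the
// real completion page so the second POST is same-site and carries the
// Session ID cookie.
protected void Page_Load(object sender, EventArgs e)
{
    var form = new StringBuilder();
    form.Append("<form id='repost' method='post' action='/checkout/complete'>");

    // Copy every field the gateway posted into hidden inputs
    foreach (string key in Request.Form.AllKeys)
    {
        form.AppendFormat("<input type='hidden' name='{0}' value='{1}' />",
            HttpUtility.HtmlAttributeEncode(key),
            HttpUtility.HtmlAttributeEncode(Request.Form[key]));
    }

    form.Append("</form>");
    form.Append("<script>document.getElementById('repost').submit();</script>");

    // repostPlaceholder is assumed to be a <div runat="server"> on the page
    repostPlaceholder.InnerHtml = form.ToString();
}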

To stop my initial page setting a new session cookie (which it will try to do because it won't receive one), I use the IIS URL Rewrite module to strip out the Set-Cookie response headers from the page.

You can do this with an outbound rule as follows:

<rewrite>
  <rules>
  </rules>
  <outboundRules>
    <rule name="Remove SetCookie Header" preCondition="Match Payment Page">
      <match serverVariable="RESPONSE_Set-Cookie" pattern=".*" />
      <action type="Rewrite" value="" />
    </rule>
    <preConditions>
      <preCondition name="Match Payment Page" logicalGrouping="MatchAny">
        <add input="{REQUEST_URI}" pattern="PAGE NAME GOES HERE" />
      </preCondition>
    </preConditions>
  </outboundRules>
</rewrite>

With this solution the cookies are still secure as the policy is set to Lax, and I can take payments using 3D Secure, which will soon become a requirement.

Using compile options for version compatibility

Here's the scenario: you're building a module and it needs to be compatible with different versions of a platform, e.g. Sitecore, and everything's great up until the day you need to call different methods in different versions of the platform. You'd rather not drop support for the old versions, and nor do you want to start maintaining two code bases. So what do you do?

C# Preprocessor Directives

Preprocessor directives provide a way to give the compiler instructions to follow while it's compiling a project. By using them we can give the compiler conditions to compile different versions in different ways, thereby allowing us to maintain one codebase but produce compilations for different versions of the platform, e.g. one for Sitecore 8.0 and another for Sitecore 9.0.

#if, #else and #endif

When the compiler encounters an #if followed by an #endif, it will only compile the code between the two if the specified symbol has been defined.

#if DEBUG
    Console.WriteLine("Debug version");
#else
    Console.WriteLine("Non Debug version");
#endif

Defining a preprocessor symbol

For the #if directive to work, you're going to need to define the symbol which is being evaluated.

This can be included in code as follows

#define YOURSYMBOL

A more useful way of defining this, however, is to include it in your build rather than in the code. The C# compiler switch is shown below; with MSBuild you can achieve the same thing by setting the DefineConstants property (this is particularly useful when using a build server).

-define:name[;name2]

If you're compiling from Visual Studio, an easier solution is to set up a new build configuration with a conditional compilation symbol.

  1. Right click your solution item in Solution Explorer and select Properties
  2. Click Configuration Properties on the left and then Configuration Manager on the right
  3. In the pop up window click the Active solution configuration drop down and then click New
  4. Enter the name of the build config. In my example I have SC82 for Sitecore 8.2 and SC90 for Sitecore 9.0.
  5. Click Ok and close all the windows you just opened
  6. Right click the project that you're going to build and select Properties
  7. Select the Build tab
  8. Select your build configuration from the Configuration drop down at the top
  9. Enter the symbol you're using for the #if directives (e.g. SC82 or SC90), as shown in the sketch below
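With the SC82 and SC90 symbols defined, the version-specific code can then be wrapped in the same #if directives. A minimal sketch, where the two API calls are hypothetical stand-ins for whatever differs between the platform versions:

#if SC82
    // Hypothetical call that only compiles against the Sitecore 8.2 assemblies
    var index = LegacySearchApi.GetIndex("products");
#elif SC90
    // Hypothetical replacement call in the Sitecore 9.0 assemblies
    var index = NewSearchApi.GetIndex("products");
#endif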

Reference different versions of an assembly

Adding conditions to our code is good, but for this to fully work we also need to reference different versions of the assemblies that are causing the issue in the first place.

There's no way of doing this through Visual Studio, but by editing the .csproj file manually we can update the hint path on a reference to include the configuration name as a variable:

    
    <Reference Include="Sitecore.Kernel">
      <HintPath>..\libraries\$(Configuration)\Sitecore.Kernel.dll</HintPath>
      <Private>False</Private>
    </Reference>
    

This example shows how different versions of the Sitecore Kernel can be referenced by keeping each version in a subfolder that corresponds with the build configuration name.

As well as different versions of assemblies, it may also be necessary to target different versions of the .NET Framework. This can be done in the .csproj file by including additional property groups that have a condition on the configuration name:

  
  <PropertyGroup Condition="'$(Configuration)' == 'SC82'">
    <OutputPath>bin\SC82\</OutputPath>
    <DefineConstants>TRACE;SC82</DefineConstants>
    <Optimize>true</Optimize>
    <TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>
  </PropertyGroup>
  <PropertyGroup Condition="'$(Configuration)' == 'SC90'">
    <OutputPath>bin\SC90\</OutputPath>
    <DefineConstants>TRACE;SC90</DefineConstants>
    <Optimize>true</Optimize>
    <TargetFrameworkVersion>v4.6.2</TargetFrameworkVersion>
  </PropertyGroup>
  

In this example I'm targeting .NET 4.5.2 for my Sitecore 8.2 configuration and .NET 4.6.2 for my Sitecore 9 configuration.

Useful Links

C# preprocessor directives
-define (C# Compiler Options)

Sitecore: Extend profile matching over multiple visits

In Sitecore, to gain a better understanding of our visitors' interests we have the ability to define Profile Keys and Cards to tag our content with. As our visitors navigate through the site, this data is used by Sitecore to build a profile of the visitor. A pre-defined Pattern Card that most resembles the visitor's profile is then assigned to the visitor, which can be used as the basis of selecting the content that should be displayed on a page for that visitor.

However, what this doesn't do is carry the visitor's profile over multiple sessions. Each time a visitor comes back to the site in a new session, the visitor's profile key values are reset back to zero.

So what’s Sitecore actually doing?

Before working out how to carry this information between visits, let's look at how a profile is actually being created.

If we look in the Profiles table within the Analytics database we can see the profile data that's been recorded for a visitor's visit.

Sitecore profile data

The Pattern Values column contains the current profile key scores for each key the visitor has a score for. e.g.

background=40;scope=50

If the visitor was to visit a page which has a scope score of 5 and a background score of 10, these values would be added to the visitor's current key scores, e.g.

background=50;scope=55

When a pattern card is assigned, the card with the closest shape of keys is chosen. e.g. If the visitor has a high value for background and low value for scope they will be assigned a pattern card with similar proportional key values.

How do we extend this over multiple visits?

So the easiest way to carry the visit information from one visit to the next would be to simply copy the profile key values from the last session to the next. The code for this would look similar to the following:

var currentVisitIndex = Tracker.CurrentVisit.VisitorVisitIndex;
 
if (currentVisitIndex <= 1 || !Tracker.CurrentVisit.Profiles.Any())
{
    return;
}

var previousProfiles = Tracker.Visitor.GetVisit(currentVisitIndex - 1, VisitLoadOptions.All).Profiles;

foreach (var profile in previousProfiles)
{
    var currentProfile = Tracker.CurrentVisit.GetOrCreateProfile(profile.ProfileName);

    currentProfile.BeginEdit();

    foreach (var profileKey in profile.Values)
    {
        currentProfile.Score(profileKey.Key, profileKey.Value);
    }
    currentProfile.UpdatePattern();

    currentProfile.EndEdit();
}

Now the visitor's profile is how it was when they left and, crucially, we can use this data to personalize the site's homepage for the visitor.

So why shouldn’t we do this?

As simple as this is, it comes with one potentially massive downside. If we go back to the way the profile values are built up, the key values are essentially just being accumulated. Each time the visitor visits an item with a background score of 10, the visitor's background profile key score is increased by 10.

Our visitors are humans going through different stages of their lives, with constantly changing jobs and interests. There's nothing to ever reduce a profile key's score other than the fact everything is normally zeroed on each visit. By copying the data from the last visit at the start of the next, this would never happen and the profile keys will continue to count up forever. The key value obtained from an item viewed 2 months ago would be counted as just as important as the value obtained from an item viewed today.

So if you were running a travel site and a visitor looked at summer holidays for 3 weeks, they would have a profile highly weighted towards summer holidays. If they then started to look at winter holidays, we wouldn't want them to have to look at winter holidays for another 3 weeks just to have an even balance of summer and winter.

Overcoming this issue isn't so simple and largely depends on your business needs. If your visitors' interests could change each week then you need something that will degrade the old visit data values quickly, whereas if you're trying to differentiate between people that are in a 2 week vs 6 month buying pattern, you need to retain that data a lot longer.

Some things we could do when copying the data from the visitor's previous profile include:

  • Halving the profile scores, or reducing them by a different factor (see the sketch after this list). This would reduce the importance of values obtained on previous visits, so if a visitor received a 10 on the first visit it would be worth 5 on the second, 2.5 on the third, etc.
  • Look at the date of the last visit. Is it too old to still be relevant, or can we use the age to determine what factor we should reduce the scores by?
  • Look at a combination of multiple previous visits to establish what the recent scores were
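As a sketch of the first option, the copying loop from earlier could halve each score as it carries it over (same tracker APIs as the code above):

foreach (var profileKey in profile.Values)
{
    // Carry the score over at half its previous weight so older
    // visits gradually lose influence on the visitor's pattern
    currentProfile.Score(profileKey.Key, profileKey.Value / 2);
}
currentProfile.UpdatePattern();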

All these ideas, though, need to be used in conjunction with what you're trying to profile. If it's age then you know people are going to get older. If it's an interest that will change frequently then you know the data needs to degrade quickly, but if it's male/female then that doesn't necessarily need to degrade at all.

Bundling and Minification error with IIS7

.NET's standard tools for bundling and minification are a great asset to the platform, solving a lot of problems with only a few lines of code.
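For context, a typical registration looks something like the sketch below; the bundle names and file paths are placeholders. The important point is that the bundle URL is virtual, with no physical file behind it:

using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        // "~/bundles/scripts" has no file on disk; requests for it are
        // handled by a managed module at runtime
        bundles.Add(new ScriptBundle("~/bundles/scripts").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js"));

        bundles.Add(new StyleBundle("~/bundles/css").Include(
            "~/Content/site.css"));
    }
}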

However, if you're using IIS 7.0 you may run into a strange issue where the path to your bundles gets created but a 404 is returned whenever they're accessed. The functionality is obviously installed and working, otherwise the URL wouldn't be created, but a 404 clearly isn't what you want.

The solution lies in your web.config file: set runAllManagedModulesForAllRequests to true.

<system.webServer>
    <modules runAllManagedModulesForAllRequests="true">
    </modules>
</system.webServer>

Convert string or int to enum

Enums are great, but you may be wondering how you can turn an integer or string value into the corresponding enum. For example, you may have an API that's being sent XML or JSON, and you need to turn one of the values within that into an enum to set on an object in your code.

Well it’s very simple. If you have a string do this:

YourEnum foo = (YourEnum) Enum.Parse(typeof(YourEnum), yourString);

or if you have an int do this (you could also just do the first example with a ToString() on the end):

YourEnum foo = (YourEnum)yourInt;
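If the incoming value might not be valid, a safer option is Enum.TryParse (and Enum.IsDefined for ints). A minimal sketch using the same hypothetical YourEnum:

// String that may or may not match a YourEnum member
if (Enum.TryParse(yourString, ignoreCase: true, out YourEnum parsed))
{
    // parsed is now a valid YourEnum value
}

// Int that may be outside the defined range
if (Enum.IsDefined(typeof(YourEnum), yourInt))
{
    var fromInt = (YourEnum)yourInt;
}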

Back to basics string vs StringBuilder

This is simple stuff, but it's something I see people easily miss by just not thinking about it.

A string is an immutable object, which means once created it cannot be altered. So if you want to do a replace or append some more text to the end, a new object will be created.

A StringBuilder however is a buffer of characters that can be altered without the need for a new object to be created.

In the majority of situations a string is a perfectly reasonable choice, and creating an extra 1 or 2 objects when you append a couple of other strings isn't going to make a significant impact on the performance of your program. But what happens when you are using strings in a loop?

A few weeks ago one of my developers had written some code that went through a loop building up some text. It looked a little like this:

string foo = "";

foreach (string baa in someSortOfList)
{
    foo += " Value for " + baa + " is: ";

    var aValue = from x in anotherList
                 where x.name == baa
                 select x;

    foo += aValue.FirstOrDefault().value;
}

Everything worked, apart from the fact it took 30 seconds to execute!

He was searching through it convinced that the LINQ expression in the middle was what was taking the time, and was at the point of deciding it could not go any faster without a new approach.

I pointed out that not only had he used strings rather than a StringBuilder, but the loop also created around 10 string objects within it. The loop, which repeated a couple of thousand times, was therefore creating 20,000 objects that weren't needed. After we switched from strings to a StringBuilder the loop executed in milliseconds.
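A minimal sketch of what the StringBuilder version looked like, using the same hypothetical someSortOfList and anotherList collections as above (requires System.Text and System.Linq):

var foo = new StringBuilder();

foreach (string baa in someSortOfList)
{
    // Append mutates the buffer rather than creating new string objects
    foo.Append(" Value for ").Append(baa).Append(" is: ");

    var aValue = anotherList.FirstOrDefault(x => x.name == baa);
    if (aValue != null)
    {
        foo.Append(aValue.value);
    }
}

string result = foo.ToString();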

So when you're trying to work out why your code may be slow, remember the basic stuff.

Image resizing

Using 3rd party components, while essential, can also be a headache. All too often you can find yourself spending hours following the setup instructions to the letter only for it to not work. Other times you know it's the exact component you want to use, but it seems that while the authors may have spent a huge effort writing the component, they seemingly never got round to writing anything to tell you how to use it.

Every so often though I come across a component that works first time and does exactly what you want. One such component that I've found this with recently is ImageResizer.

I needed a simple way for images to be resized for a product listing on an eCommerce platform written in Classic ASP. Image resizing can be a real pain: there's the decision of whether you do it when the images are uploaded or on the fly; if you do it on the fly there's a huge number of mistakes you can make, and nearly every example of how to do image resizing includes a couple of them. Then lastly, in either scenario you need to make sure the end result is actually decent quality.

With ImageResizer I was done in 5 minutes! It's actually a .NET component rather than Classic ASP, but that wasn't an issue. You create a bin folder, add a couple of simple lines to your web.config file, then put an extra extension on the image path plus parameters for the size you want and that's it!

If you're doing a .NET project it's also available on NuGet, making it even easier to get up and running.

LINQ to SQL Inserts and Deletes

Inserting and deleting records in a database using LINQ to SQL is just as easy as selecting information. What's not so easy is actually finding out how to do it. There are lots of excellent blog posts around, such as this one by Scott Guthrie http://weblogs.asp.net/scottgu/archive/2007/07/11/linq-to-sql-part-4-updating-our-database.aspx, however most of them were written for the Beta version of LINQ to SQL, which let you do a .Add() or .Remove() on your table; this was changed in the final release.

So to insert do something like this: 

DataClassesDataContext dataContext = new DataClassesDataContext(); 

//Create my new Movie record
Movie movie = new Movie();
movie.Name = "Tim's movie"; 

//Insert the movie into the data context
dataContext.Movies.InsertOnSubmit(movie); 

//Submit the change to the database
dataContext.SubmitChanges();

And to delete do something like this:

DataClassesDataContext dataContext = new DataClassesDataContext();

var movies = from m in dataContext.Movies
                  where m.Name == "Tim's movie"
                  select m;

dataContext.Movies.DeleteAllOnSubmit(movies);

dataContext.SubmitChanges();
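For completeness, updating follows the same pattern: load the record, change it and call SubmitChanges. A minimal sketch using the same hypothetical Movie table as above:

DataClassesDataContext dataContext = new DataClassesDataContext();

//Find the movie to update
var movie = dataContext.Movies.FirstOrDefault(m => m.Name == "Tim's movie");

if (movie != null)
{
    //Change the tracked object and submit the change to the database
    movie.Name = "Tim's updated movie";
    dataContext.SubmitChanges();
}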

LINQ to SQL Connection Strings

LINQ to SQL is great, but like all great things, at some point it does something that you don't expect and gives you a headache. An example of this happened to me this week with the differences between how connection strings are handled when your LINQ to SQL model is in a class library rather than a Website or Web Application.

What makes this issue particularly annoying is that it only appears when you try to change the database server that your code is looking at, which could end up being when it's going live or moving to a staging server.

So we all know about connection strings: they're quite simple and you just store them in your web.config file, which is how LINQ to SQL works when you're using it in a Website. But as soon as you move the model to a class library things change. First, your connection string name is no longer that simple name you gave it, e.g. ConnectionString; now it is prefixed with the namespace, which is annoying but not the end of the world. The second discovery, though, is that no matter what you do, it just doesn't seem to pick up the connection string from the web.config file. The reason is that your original connection string has been compiled into the class library's dll, and that is what it is using.

The Solution

Depending on when you discover this, the solution is not too bad, as you'll either have a lot of code to change or only a small amount. You can always pass a connection string to the constructor when you are creating an instance of the data context, e.g.

DataClasses1DataContext da = new DataClasses1DataContext(connectionstring);

You can also set the connection string on your LINQ to SQL model to be blank; this will remove the default constructor and force you to pass a connection string. This way your web application has the choice of what connection string to use and you can keep re-using your class library in different projects.
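A minimal sketch of the web application end, assuming the connection string lives in the web.config under a hypothetical name of "MyDatabase":

using System.Configuration;

// Read the connection string from the web application's web.config
string connectionString = ConfigurationManager
    .ConnectionStrings["MyDatabase"].ConnectionString;

// Pass it to the data context so the class library's compiled-in
// default is never used
using (var dataContext = new DataClasses1DataContext(connectionString))
{
    // query as normal
}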

Introduction to Real Time Search

The Real Time Search this article relates to is a dll that comes included in the Web Client Software Factory, one of Microsoft's patterns & practices downloads. What it can do, and what this example will demonstrate, is adding the ability to cause a postback that refreshes an update panel as the user types into a search box. This can give a great effect and make a web app really user friendly in scenarios like searching photos, emails or any general list.

First let me state that this is in no way the most optimal way to program this. In most scenarios you could build a better result using something like JSON, as there will be a lot less data transfer, which is also a general reason to avoid update panels. However, this is also very quick and very easy to implement, not to mention that if you've ever used update panels before you already know 90% of what's needed. It can also only work in situations where you have a good search that is going to return the results quickly, rather than leaving the user sitting there trying to work out why nothing's happening and where the search button has gone.

Implementing Real Time Search

For this example I will be filtering a table from a DB based on the search criteria and refreshing a GridView with the results. I will be using a normal C# Web Site project with the AdventureWorks sample DB from Microsoft. The DB connection will be done using LINQ to Entities (Entity Framework), however there is no need to use this; it is just my preference for the example.

First off, set up your website and DB and make sure both are working with no problems. As results will be displayed in an UpdatePanel, get one of these along with a ScriptManager added to your page, so it looks something like this:

<form id="form1" runat="server">
<div>
<asp:ScriptManager ID="ScriptManager1" runat="server">
</asp:ScriptManager>
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
<ContentTemplate></ContentTemplate>
</asp:UpdatePanel>
</div>
</form>

Next let's get the search working in the normal way, so I'm going to create my Entity Model and add a TextBox and GridView to show the results. Again, you can connect and show your results however you want. You should now have something like this in your aspx file:

<form id="form1" runat="server">
<div>
<asp:ScriptManager ID="ScriptManager1" runat="server">
</asp:ScriptManager>

<asp:TextBox ID="txtSearch" runat="server" OnTextChanged="TextChanged" Text=""></asp:TextBox>

<asp:UpdatePanel ID="UpdatePanel1" runat="server">
<ContentTemplate>

<asp:LinqDataSource ID="LinqDataSource1" runat="server" onselecting="LinqDataSource1_Selecting">
</asp:LinqDataSource>

<asp:GridView ID="GridView1" runat="server" AutoGenerateColumns="False" DataSourceID="LinqDataSource1">
<Columns>
<asp:BoundField DataField="ProductID" HeaderText="ProductID"
ReadOnly="True" SortExpression="ProductID" />
<asp:BoundField DataField="Name" HeaderText="Name"
ReadOnly="True" SortExpression="Name" />
<asp:BoundField DataField="ProductNumber" HeaderText="ProductNumber" ReadOnly="True" SortExpression="ProductNumber" />
<asp:BoundField DataField="Color" HeaderText="Color"
ReadOnly="True" SortExpression="Color" />
<asp:BoundField DataField="SafetyStockLevel" HeaderText="SafetyStockLevel"
ReadOnly="True" SortExpression="SafetyStockLevel" />
<asp:BoundField DataField="ReorderPoint" HeaderText="ReorderPoint"
ReadOnly="True" SortExpression="ReorderPoint" />
</Columns>
</asp:GridView>
</ContentTemplate>
</asp:UpdatePanel>
</div>
</form>

And this in your code behind:


protected void LinqDataSource1_Selecting(object sender, LinqDataSourceSelectEventArgs e)
{
    Model.AdventureWorks2008Entities adventureWorks = new Model.AdventureWorks2008Entities();

    var products = from p in adventureWorks.Product
                   where p.Name.Contains(txtSearch.Text)
                   select p;

    e.Result = products;
}

Next it's time to add the Real Time Search. Make sure you have the dll downloaded (you may need to compile the download to get it) and add it to your bin folder. Add the following to your page in the relevant places:

<%@ Register Assembly="RealTimeSearch" Namespace="RealTimeSearch" TagPrefix="cc1" %>

<cc1:RealTimeSearchMonitor ID="RealTimeSearchMonitor1" runat="server" AssociatedUpdatePanelID="UpdatePanel1">
<ControlsToMonitor>
<cc1:ControlMonitorParameter EventName="TextChanged" TargetID="txtSearch" />
</ControlsToMonitor>
</cc1:RealTimeSearchMonitor>

Important things to notice here are the AssociatedUpdatePanelID, which tells the control what it has to refresh, and the ControlsToMonitor section, which sets which control to watch and the event that will fire when the postback is created. You will now need the corresponding event handler in your code behind, like so:

protected void TextChanged(object sender, EventArgs e)
{
    GridView1.DataBind();
}

Run the site and you should find that the GridView now updates as you type (albeit with a slight delay).

To improve it, you can add other controls like an UpdateProgress to show something is happening, which will help with any delays in displaying the results.