One of the most frequently asked questions we receive here at IntuitionHQ is from users who are interested in A/B testing, preference testing and usability, but who don’t know what kind of test is right for them, what kind of results they could expect with each kind of test, or even how to get started with usability testing in the first place.
We addressed the last issue in a previous post, ‘How to get started with usability testing in seven simple steps’, but the first two questions can still be quite confusing. It’s worth addressing the differences between these two types of tests, because both can be very useful for gathering certain types of information. Read on to learn what kind of information each test is best suited to, and to work out which kind of test is best for you.
A/B and Preference Testing for Usability
A/B Usability Testing
A/B testing is a very popular method for testing variations on live sites: you can create two different variations of a piece of text, a button or any other element, and find out which one works best. This is achieved by sending roughly 50% of the site’s traffic to each variation (either A or B – not both) and seeing which one performs better. Two useful options for this are Google Website Optimizer and Visual Website Optimiser – both of these tools are very handy for testing on live sites.
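The 50/50 traffic split above can be sketched in a few lines. This is a minimal, illustrative example (tools like Google Website Optimizer handle this for you); the function name and hash are our own, not any tool’s API. Hashing a stable visitor ID, rather than picking at random on each page view, keeps a visitor in the same variant on repeat visits:

```javascript
// Minimal sketch of deterministic 50/50 A/B bucketing (illustrative only).
// Hashing the visitor ID means the same visitor always sees the same
// variant, while the split across many visitors approaches 50/50.
function assignVariant(visitorId) {
  let hash = 0;
  for (const ch of visitorId) {
    // Simple 32-bit rolling hash over the ID's characters
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "A" : "B";
}
```

In practice you would store the assignment (e.g. in a cookie) and record which variant each visitor saw alongside their behaviour, so the two groups can be compared.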
A/B usability testing is a very similar concept. Whatever ideas you have to test, whatever variations you can think of, you just upload them, set a task, and see which one works best. The great thing about this is you don’t need to make changes to your live site, and it’s easy to add multiple tasks to test different variations.
A/B usability testing is great for testing alternatives – for example, navigation structure, button design or button location – basically anything where even a small variation could make a difference to the end user and to the usability of your site, service or product.
Preference Testing
Preference testing is when two images, wireframes or screenshots are shown side by side, and users choose which one they prefer based on the criteria you set for them – generally something along the lines of ‘which design do you prefer?’.
Preference testing is really useful for testing a range of different things: it can help you to better understand conventions in design (as shown in our ‘The User Experience and Psychology of Colour’ article on Spyre Studios), to look at preferences across cultures, and to understand how small differences affect your users. As with A/B usability tests, there is a huge range of things you could test this way, and many ways your site could benefit from this kind of testing.
Some examples: A/B and Preference tests
Of course, it’s easy to talk about these different testing methods, but the most effective way to understand what they really mean is to see them in action:
One great example of this is a test we ran comparing Bing and Google. Obviously the objectives of the two sites are very similar, but the way they are structured is quite different. We wanted to find out which one worked better – not in terms of search results, but in terms of usability and optimisation.
We looked at some of the more common goals that both search engines would be trying to achieve. In the example above, we’ve used finding advertising as part of our test – advertising being a crucial part of the business for any search engine. In the first part of the test, users are directed to a screenshot of either Bing or Google, and try to find the link to advertise with that provider.
When looking at the results (shown above – click to enlarge) we can see that not only is the average click time for Google much lower in this example (just over 7 seconds versus almost 11 seconds), but also that the success rate of users finding the correct button is much higher (77% for Google versus 65% for Bing).
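Both headline metrics – success rate and average click time – fall straight out of the raw click records. The sketch below shows one way this could be computed; the data shape here is hypothetical, not IntuitionHQ’s actual format:

```javascript
// Derive success rate and average click time from raw click records.
// Each record notes whether the user hit the correct target and how
// many seconds it took them (hypothetical data shape).
function summarise(clicks) {
  const successes = clicks.filter((c) => c.correct);
  return {
    successRate: successes.length / clicks.length,
    avgClickTimeSec:
      clicks.reduce((sum, c) => sum + c.seconds, 0) / clicks.length,
  };
}

// Four example records: three correct clicks, one miss
const sample = [
  { correct: true, seconds: 6.5 },
  { correct: true, seconds: 7.8 },
  { correct: false, seconds: 12.1 },
  { correct: true, seconds: 7.0 },
];
```

Running `summarise` over each variant’s records gives you the two numbers to compare side by side, just as in the Bing-versus-Google example.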
What this means is that Bing should really focus on the location of this button and try to make it more visible. Obviously, the easier and more straightforward it is to advertise, the more people will do so. Even a 1 or 2 percent bump in advertising spend would mean significant revenue for the company.
We recently wrote an article over at Spyre Studios about The User Experience and Psychology of Colour. It was prompted by the colour of the labels used at Clicky, a web analytics tool, which often uses red for confirmation/positive messages as well as for failure/warning messages.
What our testing showed, with quite an overwhelming majority, is that most people (88% in our tests) associate red with failure.
It also showed that red is the colour that stands out the most, although, as I argued in the post, that alone isn’t a good reason to use it for all types of messages. Certainly, if it’s overused, it will lose much of its effectiveness.
And indeed, what we can see from the above image is that most users think red makes the most sense for warning text – a huge 97% of users chose red for it. Another question in the test had 74% of users choosing green for their success message. You can take the whole test here, or view the results here.
While these might be the results you’d expect in this example, there are many cases where, based on culture, gender, age or a number of other factors, people will choose different options in this type of test. If you want to make your website as effective as possible for your audience, you should really perform this type of testing.
Now that you (hopefully) know the difference between A/B usability testing and preference testing, you can work out which is the best option next time you are getting ready to test.
This brings us to the last point: frequency of testing. Obviously there is more testing to be done in the earlier stages of design and development, but the key is constant, consistent usability testing – to monitor for shifts in user expectations and to adjust to the ever-changing web. If you ever have an idea for how you could improve your site, test it and see how it works in reality. It’s never a bad time to test.
As I’ve quoted time and time again, ‘Build it and they will come; build it well and they will come back’. The better, more usable, more enjoyable your website, the better it will be for you and your users, and the more successful you and your site will be in the long run. How can you afford not to test? Why not head over to our homepage and sign up for your free IntuitionHQ account today. It’s always a good time to get started with testing.
Do you have questions or comments about this article, or usability testing in general? Feel free to ask – we love to help. And don’t forget to subscribe to our RSS feed to keep up to date with all the latest news in usability.