There is an interesting article over at UXBooth.com discussing those times when you are in a client meeting, awaiting final approval, and all of a sudden the big wig comes in, scrunched-up piece of paper in hand, with new directions for the site.
“Put this link here, rearrange the navigation, change the colours of these buttons…” – the list just goes on and on.
I guess we’ve all had a few of those meetings and know how incredibly frustrating it can be after months of research and development to have someone who doesn’t really understand usability and user experience come along and tell us to change things. Of course, as Oliver rightly points out in his article, this person is our client, is paying our bills and may well be someone we want to work with in the future.
That said, we don’t want to give up a better, more usable design just to please this person, so how can we show them which design works best? Read on to find out how we do it.
How do we avoid these problems?
I recently wrote about “The key to successful collaboration” which covered some of the things you need to remember when working with clients, namely keeping communication lines open, and making sure decision makers are included in the loop. I suggested using tools like IntuitionHQ.com to help simplify that process as seen in the graphic below:
Of course, in a perfect world the client would be in the loop the whole time and you wouldn’t get these last-minute surprises. Unfortunately, for one reason or another it doesn’t always work out this way, and you need a solution for those crunch times when the client throws new ideas at you at the end of a contract.
How can we solve the CEO’s design dilemma?
I’m happy to say we’ve recently introduced a new feature to IntuitionHQ that addresses this very problem: A/B usability testing.
I guess most of you are familiar with the concept of A/B testing (and if not, you should read this ultimate guide over at Smashing Mag), and I’m sure most of you have seen A/B tests used to test conversions on sites post launch. A/B testing has done this task quite admirably, but for us usability testing folk the process wasn’t as straightforward. Conversions are one thing; design and usability are another.
With our A/B testing system, just as with all IntuitionHQ tests, you can simply upload a screenshot or wireframe, share it with whoever you like (we quite like sharing it on our Twitter feed and Facebook page) and watch as the results come in.
The beauty of this is you can take the ‘wireframe’ that your client drew up for you (you could even just scan it in), put up one of your wireframes or concept sketches, and test away. It takes minutes to set up and send out this kind of test, and can instantly produce quite overwhelming results.
Testing and results: What does it all mean?
Let’s say you got the following results from an average user:
- 5 seconds to add a product to the cart with your design versus 10 seconds with the CEO’s
- 80% success rate for your design versus 50% for the CEO’s design
- 8 seconds average click time with a 90% success rate for your design versus 6 seconds with a 60% success rate for the CEO’s
In this situation it’s plainly obvious which design is working better, and this information can rebut any design decisions that uninformed people might try to make. It can help to validate your designs as well. Testing can really be a good friend to you.
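If you want to go beyond eyeballing the numbers, a quick significance check can tell you whether a gap like 80% versus 50% could plausibly be down to chance. Here’s a minimal sketch using a standard two-proportion z-test (the sample size of 30 testers per design is a made-up figure for illustration, not something IntuitionHQ reports):

```python
from math import sqrt

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test: how confidently can we say one
    design's success rate beats the other's?"""
    p_a = successes_a / n_a
    p_b = successes_b / n_b
    # Pool the two samples to estimate the shared success rate
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 30 testers per variant, 80% vs 50% success
z = two_proportion_z(24, 30, 15, 30)
print(round(z, 2))  # a |z| above 1.96 is significant at the 95% level
```

With those (assumed) numbers the gap comes out well above the 1.96 threshold, which is the kind of evidence that’s hard to argue with in a meeting.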
We currently have a concept wireframe up for the main Boost site, which we are redesigning, and we are testing it against the old Boost site.
You can have a look at that test at https://boost.intuitionhq.com/boost-homepage-redesign to get an idea of how testing wireframes might work. As I say, you can even do an A/B test with a wireframe and a screenshot against each other. If you’d like to see some results from one of our A/B tests, you can check out this page. Quick, easy, and concise.
If you’ve got any thoughts on this we’d love to hear them. And why not try it out yourself – you can sign up for a free account at IntuitionHQ.com and create some sample tests.
Good luck with your testing – we hope you get as much out of it as we do.