A couple of weeks ago I was asked if I would like to write a blog post for our company website about the different tools that allow you to create a personalised photo calendar to send out to your family and friends this Christmas. Maybe you have read it?
This was basically a mixture of a UX audit and a competitor review, so I thought it might be interesting to explain what I did and how I came to my recommendations.
The first thing to do was to find some sites to test. Google and my team were very helpful here and I soon had a list of seven to test.
Next I thought about who would be using these sites. Although anyone could be, I decided it was most likely to be busy parents: people who do not have time to work out how to use a complicated tool, and who might need to save their work midway through if family life gets in the way. Not everyone fits that description, but people in general do not have time to puzzle out complicated systems and often want to save their progress, so this seemed like a good starting point.
From here I needed to come up with a list of things I would expect the sites to do. Could I easily see what size calendar was available? Could I save my calendar mid way through the process? Was it easy to see how much it would cost as I added things? Could I do it on mobile?
With a long list of tasks, I then had to work out a scoring system. I broke it into three levels: yes, I could definitely do the thing (4 points); yes, I could partially do the thing (2 points); no, I couldn't (0 points). With 30 tasks, each calendar tool could score a maximum of 120 points.
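The rubric is simple enough to sketch in a few lines of Python. The criteria names and results below are hypothetical examples for illustration, not the actual audit data:

```python
# A minimal sketch of the three-level scoring rubric: each criterion
# is rated yes (4), partial (2), or no (0), and a tool's score is the sum.
# Criteria and ratings here are made-up examples.

SCORES = {"yes": 4, "partial": 2, "no": 0}

criteria = [
    "Calendar sizes clearly listed",
    "Can save progress midway",
    "Running cost shown while editing",
    "Works on mobile",
]

def score_tool(results):
    """Total a tool's score, where results maps each criterion to yes/partial/no."""
    return sum(SCORES[results[c]] for c in criteria)

example_tool = {
    "Calendar sizes clearly listed": "yes",
    "Can save progress midway": "partial",
    "Running cost shown while editing": "yes",
    "Works on mobile": "no",
}

print(score_tool(example_tool))  # 4 + 2 + 4 + 0 = 10
```

With the full set of 30 criteria, a perfect tool would score 4 × 30 = 120, matching the maximum above.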
Now I had a list of things to test, all I had to do was work through each calendar tool to see what it would do. This also meant that I had somewhere to write notes for myself as I worked through.
I also took screenshots as I worked through to help me remember the differences or particular points I wanted to make, and to use in the article.
By scoring every tool against the same small interactions and tasks that a real user would complete, I ended up with a robust result that rested on the scores rather than my personal opinion.
The actual testing was pretty straightforward as I worked through my sheet of questions for each tool and scored them.
Interestingly, a couple of tools got reasonable scores even though I found them hard to use. But the system did work: the tools that were really good got the best scores, while the ones to avoid came out bottom.
It was an interesting piece of work which really got me thinking about users' needs when using a creative tool like this. Not everyone is an expert or wants to be an expert. They just want to create something nice to share with others.
I also love how versatile this system of scoring is. The idea can be used to look at any site or tool just by changing the criteria for testing each time. I know I will be using it again.