Every couple of days there is a new blog post saying "Listen to your users" or "Don't listen to your users" (wait... what?); however, regardless of their conclusions, there is no way to follow a User-Centered Design process without talking to your users... a lot...
The term User Research can be a monster in itself; just try googling it and you'll find an endless cacophony of "10 simple steps" and "revolutionary methods". The important things to understand are that, one, there are hundreds of techniques that can be applied, and two, you don't need to use them all in a single project. Choose and adapt the methods that best fit you and your project, the ones that yield the results you need for your design process.
In this post, I'll go through the process I've applied in a real project here at Whitesmith; explaining what worked well and not so well, and providing some examples of actionable feedback we got from users. I hope you learn something new or interesting. (ps: if you did, let us know @whitesmithco!)
Who are your users?
The first sign we needed some User Research on this project was when I realised I knew nothing about the target user. This meant I was not able to understand the goals, fears and frustrations of this type of user.
Tip #1. If you are not familiar with the user profile of the product, you probably need to either interview some users or do some field work.
Luckily, this project was a revamp of an existing product, meaning that the Client already had a working product and established connections with some of their users. When you are not familiar with a type of user, a good method to apply is the interview: although it is important to have a script, it is not a strict technique, and you can dig deeper into the goals and pains of users when needed.
Tip #2. When recruiting users for research, it's always best to get existing, or at least potential, users of the product. When that is not possible, try finding analogous areas with similar skill requirements.
This product had two separate user profiles, each with its own needs and frustrations, so we needed two interview scripts.
The purpose of the interview script is to provide some structure to the interview, with starting questions to fuel the discussion.
It should include an introductory explanation and a closing thank you note; it's important to explain that there are no right or wrong answers since we are only gathering insights to improve the user experience.
As I said, the script should be used as a starting point for discussion; however, if you are talking more than your user, you are doing it wrong... Give your user a topic and let them explore further, if they don't have anything else to say try the 5 whys technique.
Tip #3. Listen, listen, listen... did I say you need to listen yet?
A good way to design the interview script is to follow the User Journey or Flow (the path from when the user first comes in contact with the product until the last moment): try to discover what triggers each step, and when, how often and with what urgency it happens. It is also good to ask whether users are familiar with competitors and why they chose this product instead. Similarly, if during the interviews you notice that different users employ different labels for the same concept, it is a good idea to try to build a shared conceptual model between users. The key is to avoid questions that can be answered with a yes/no, so the user always has to explain their reasoning and motives.
When doing the interviews, try to have at least one other person with you. This way you can focus on the interview (managing the script and follow-up questions) while your colleague takes notes. Taking good notes is an art in itself; in my experience, what works best is to tell your partner to take notes verbatim, or as close as possible. It's incredible how different two interpretations of the same sentence said out loud can be...
Tip #4. There is no magic number (5) of users you need to interview to get enough insights.
There is a famous claim, popularised by Mister UX Nielsen, that you only need five users to discover 85% of usability issues, and fifteen to discover all of them. I should say this is actually quite controversial in the UX community, because how can you know you got "all" the issues... The same applies to interviews: how can you know you interviewed enough people? My rule of thumb is to organise interviews in rounds of five users. If, after a round, the feedback you got was pretty much the same as in the previous round, you stop. This also lets you improve the script between rounds if you discover a new topic or relevant question.
After the interviews comes the arduous task of extracting actionable insights from them. I like to divide the transcripts into individual comments and summarise them (hopefully without changing the meaning...), and then group them into three categories: Pains, Gains and General notes. The Pains include possible apprehensions, current dissatisfactions and tensions, while the Gains are composed of desires, requirements and expectations the user might have towards the product.
Tip #5. Group your insights into Pains (Fears, Frustrations, Anxieties) and Gains (Wants, Needs, Hopes).
Having the user comments organised into groups also makes it easier to detect similarities, allowing you to merge notes together. I usually sort the notes by the number of times they were mentioned (with the count before the remark). This number comes in handy when you need to decide which insights to turn into features and which ones to ignore. It's also common to discover that some planned features are not something users expect from the product, while others are crucial.
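If you keep the summarised comments in a spreadsheet or a script, the grouping and counting step can be automated in a few lines of Python. This is only a hypothetical sketch: the categories match the ones above, but the comment texts and counts are invented for illustration.

```python
# Hypothetical sketch: tally summarised interview comments by category
# and sort each group by mention count. All comment texts are invented.
from collections import Counter

# One entry per user mention: (category, summarised comment)
comments = [
    ("Pains", "export to PDF is slow"),
    ("Pains", "export to PDF is slow"),
    ("Pains", "hard to find past orders"),
    ("Gains", "wants email notifications"),
    ("Gains", "wants email notifications"),
    ("Gains", "wants email notifications"),
    ("General", "uses the product daily"),
]

def sort_insights(comments):
    """Group comments by category, then sort each group by mention count."""
    grouped = {}
    for category, text in comments:
        grouped.setdefault(category, Counter())[text] += 1
    # Most-mentioned first, formatted with the count before the remark
    return {
        category: [f"{n}x {text}" for text, n in counter.most_common()]
        for category, counter in grouped.items()
    }

insights = sort_insights(comments)
# insights["Gains"] -> ["3x wants email notifications"]
```

The count prefix makes it easy to scan for the insights mentioned by several users, which are the ones worth prioritising.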
Tip #6. Always take the feedback you get with a grain of salt, especially if only one or two users mentioned it.
Overall, this process yields highly relevant insights about your users and your product, with both confirmation about issues you had identified before and new aspects to take into consideration when designing the product.
These interviews were conducted during the discovery phase, before completing the visual design of the product. Next time, I'll be sharing the Usability Testing process I went through to evaluate the mockups created for this product.
Cover photo credits: The Commons