Usability testing is a simple way to gather feedback from real users and identify issues that are hurting their experience on your website or application. Just write the test script, send it out to your target users, and then explore video and written results to understand what’s working and what’s not.
The design of your test has a big impact on the accuracy and quality of the results. A well-written test can gather valuable and relevant feedback, but a flawed test may produce results that are skewed, misleading, or outright false.
Here are some of our top usability testing tips for creating and running an effective study.
Write an immersive, relatable scenario
Ideally, you want your testers to adopt the mindset of someone using your product in a real-life situation. To achieve this, the scenario you write for your test should be detailed and realistic. Try to be general enough that it’s relatable to many people, but specific enough to evoke familiar situations that testers can draw on as they work through the test.
Test user impressions
Your landing page is a billboard for your brand. A good way to measure whether it is projecting the right message is an Impression Test.
At the beginning of your test, we show the testers your landing page for 15 seconds (the average amount of time visitors spend on a webpage) and then ask them to say what they remember from the site, how they would describe it, and what products or services it offers. It’s a great way to gauge whether your page is on target, identify the cues that guide or obstruct visitors’ understanding of the site, and help orient the testers as they start their journey.
Design a coherent user journey
Although your testers will be led through the test by clearly delineated tasks, it should mimic a real-life user journey as much as possible. Consider how visitors typically progress through different pages and steps on your site. Use what you know to write tasks that imitate the natural progress of a real-life user, and not just a jumbled collection of steps.
Be mindful of tester fatigue
Don’t overstuff your test with too many tasks, or testers will tire by the end and the quality of their feedback will degrade. Your task list should fit comfortably within 15–25 minutes. Avoid including more than 10 tasks; ideally, keep it closer to 5.
Use an accessible, straightforward voice
It can be easy to slip into “brand-speak,” talking about your website or product in a way that doesn’t make sense to new users. Remember that they will probably know nothing about your product and how it works. Steer clear of jargon; it will only confuse your testers, and can sometimes cause them to misinterpret tasks and undermine the usefulness of the test.
When you write your tasks, use simple and generic wording and think hard about what will and will not make sense to first-time visitors.
Don’t lead the witness
When writing tasks, avoid key words used in your application: names of buttons, features, pages, and so on. Focus on telling testers the end goal to be achieved, not the action to take. For example: “Save an item you like so you can come back to it later” instead of “Add an item you like to your wishlist.”
Not only will this show you how easily the user identifies and locates the way to complete the task, it might reveal a totally different way they think about achieving that goal. Instead of being a self-fulfilling prophecy, your task will produce genuinely meaningful results that may even surprise you.
Observe, don’t ask
If you’re looking for the answer to a specific question, think twice about asking testers directly. You’re likely to get a more accurate answer by steering them through a relevant task and observing what they say and do. This way, the testers are less likely to filter their thoughts because they are responding directly to the stimulus instead of trying to analyze their own thoughts and preferences.
Run your test with at least 5 people
A general rule of thumb is that you can identify about 80% of all usability issues with 5 testers, and virtually 100% of issues with 15 (due to the diminishing marginal value of each tester’s feedback). If collecting reliable quantitative metrics is important to you, you’ll need a bigger sample size. Using at least 20 testers returns solid data with a margin of error of roughly ±19%.
For larger batches of testers like this, there are many usability testing tools that can help you to analyze the larger amount of video and written feedback in a timely fashion.
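The “80% with 5 testers” rule of thumb comes from Nielsen and Landauer’s problem-discovery model. A minimal sketch, assuming their commonly cited per-tester detection rate of about 31% (the actual rate varies by product and task):

```python
def issues_found(n, L=0.31):
    """Expected proportion of all usability issues surfaced by n testers.

    Assumes each tester independently uncovers a given issue with
    probability L (~0.31 is the figure commonly cited from Nielsen
    and Landauer's research; treat it as an assumption, not a constant).
    """
    return 1 - (1 - L) ** n

# Show how quickly discovery saturates as testers are added.
for n in (1, 5, 15, 20):
    print(f"{n:2d} testers -> {issues_found(n):.0%} of issues found")
```

With L = 0.31, five testers surface roughly 84% of issues and fifteen surface over 99%, which is where the diminishing-returns guidance above comes from.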
Test your competitors’ websites
You aren’t limited to testing your own website! Running tests on your competitors’ websites can provide valuable insights about what they’re doing differently, and what users prefer about their product. Comparative usability testing can also be a way to benchmark your performance.
Finding usability testing tips when you need them
We previously shared tips on how to write a good usability test on the blog. We have now incorporated some of those tips into a Tips & Pointers popup that you can access from the upper right-hand corner of the window whenever you are creating a new usability test or editing one you have previously created.
When you click this link, an overlay window is displayed showing the tips.
Use these as you create your tests to get more out of your research.