
Remote Content User Testing: Step-by-Step Guide

The complete guide to user testing content remotely

How do you know if your web content is easy to use, or causing friction? Moderated remote content testing (wow that is a long name) is a great way to see how your content is actually performing with real people. To do it, you invite real people who fit your user persona to use your website, observe their experience, then improve your content based on what you learn from them.

Since we live in these wild corona-times of 2020, it’s safe to assume you’re only going to do this remotely. Fortunately, we’ve been doing remote content testing for years, and have learned a thing or two (or ten).

In this comprehensive guide, you will learn:

  • The benefits of testing content with real people

  • When and why to test (and when not to)

  • 2 essential types of user tests for content

  • How to run user testing: step-by-step process

  • User testing mistakes to avoid 

The benefits of testing content with real people

Usability testing is a user-centered method of testing a user experience with real people. The goal is to identify usability issues that would otherwise go unidentified. Usability tests are often done by product designers and software engineers, but can also be used to evaluate the usability of content.

Content plays an essential role in making an interface easy to use, whether it’s a website, a mobile app, a voice user interface, or any other type of experience. 

Identify content usability issues

Testing interface content with real people has a ton of benefits. First, and most importantly, it helps you identify content that makes it difficult or confusing for someone to use your interface. This could be a label or term that doesn’t make sense, or a series of onboarding messages that leave someone unsure of what to do. You might also just find that you have too much, or not enough, content for the situation. 

Better understand your audience

Second, testing with real people can give you a more nuanced understanding of your user persona. You’ll get to hear how your audience thinks, how they describe things, how they tend to use and explore content, and even how they feel. You can use these insights to write better content and validate or improve your user persona.

Uncover the “why” of user behavior

Finally, usability testing gives you qualitative insights to complement your quantitative data. Web analytics can tell you what people are doing on your interface, but not why they are doing it. Qualitative research like content usability testing can help you understand the “why” behind people’s behaviors.

When to user test content (and when not to)

Being a great content strategist is not just about having all the tools; it’s about knowing which tool is best for a given situation. Usability testing is great, but it’s not the answer to every research question. If you’re working with a new brand that doesn’t yet have a clear idea of who their audience is, then it’s too early to do usability testing—even if they have a website already. You’re better off stepping back and doing something like demographic research or user interviews, so you can start to build a persona.

When to do user testing:

  • You’re early on in a project and want to avoid costly mistakes

  • You have an existing website or app, you’re ready to invest into improving the experience, and you want some evidence to inform your re-design 

  • You have a specific problem—like abandonment at a certain point in a sign up flow—and you want to know why it’s happening

  • You are launching a new website or app, and want to see how it works before you set it free into the world

  • You want to understand how users complete tasks within your user experience

When not to do user testing:

  • When you’ve already spent months building a website/product and now you want users to validate your idea (this is way too late!)

  • You have two concepts and want to compare them (just run an A/B test instead)

  • You want to gather a large amount of data from lots of people (more than 10 user tests isn’t a good use of time, you’re better off doing a survey)

  • If for some reason you have 48 hours to test your concept, with no time to plan ahead or synthesize your findings

2 essential types of user tests for content

There are many ways to do user testing, and you can probably find at least a dozen different methods on the Internet if you look hard enough. But I’ve found that most successful content user tests fall into 2 categories.

Testing usability 

Testing usability tells you how a user completes a certain task or series of tasks, so that you can design content that makes those tasks easier.

Examples: 

  • You are writing the microcopy for a mobile app, and you want to know if the onboarding process will be easy. You would then design a task-based user test, where you ask people to go through the onboarding flow and observe their experience.

  • Your website has a help center, and you want to know if people can easily search for and find the information they want. You could design a user test where you have them start in the help center, then you ask them to find the answer to a question.

Testing comprehension 

Testing for comprehension tells you whether or not your content is clear and easily understood by your audience. This will allow you to write and structure your content in a way that is easier for people to comprehend and remember.

Examples:

  • You have a sales page for a complicated product or service, and want to know if the content clearly explains the offer and benefits. You design a test where users read the sales page, then paraphrase what they read and explain it back to you.

  • You are in the exploratory or discovery phase of a website redesign. To test for content comprehension of 4 key pages, you do a highlighter test. Users highlight information they like and understand in green, and highlight confusing information in red.

How to run remote user testing step-by-step

Step 1: Create a user testing plan

Planning ahead is an important and often rushed step. To get the most out of your user testing, first set clear goals about what you’re testing, why, and what the scope of the test will be. 

High-level considerations for your user test plan:

  • What are our goals? What do we want to learn, and what questions do we hope to answer? Are we testing for usability, comprehension, or a mix?

  • Who are our ideal participants? What disqualifies someone from participating?

  • How many participants will we need? How will we recruit them?

  • What can we tell the participants ahead of time to prepare them? What should we not tell them?

Defining the specific scenario for your test:

What will users do during the test? Plan out the scenario and set guardrails.

For example, if you’re testing a set of tasks, get very clear on the scope of those tasks and the desired user paths they will take.

If you’re testing for comprehension, think about what level of understanding you hope users will have and for what specific content. 

Personally, I like to create my user test plans on Notion.com. There’s just something really clean, functional, and simple about it. It’s a bit prettier and more organized than a Google Doc, but not overly complicated either.
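Wherever you write it down, the plan only needs a handful of pieces. If it helps to picture the shape, here’s a minimal sketch of a test plan captured as plain data (Python, purely for illustration; every name and value below is a made-up placeholder, not a prescribed template):

```python
# A minimal, illustrative user test plan captured as plain data.
# Every field name and value is a made-up placeholder, not a required template.
user_test_plan = {
    "goals": [
        "Learn whether first-time visitors understand what the product does",
        "Find out where people get stuck in the sign-up flow",
    ],
    "test_type": "usability + comprehension",
    "participants": {
        "profile": "Small-business owners who manage their own website",
        "disqualifiers": ["Works in UX or web design", "Existing customer"],
        "target_count": 6,
    },
    "scenario": {
        "starting_point": "Homepage",
        "tasks": [
            "Find the pricing page and describe the plans in your own words",
            "Create an account and complete the onboarding flow",
        ],
        "out_of_scope": ["Billing", "The mobile app"],
    },
    "tell_participants": "You'll try out a website and think aloud as you go.",
    "do_not_tell_participants": "Which pages and flows we're evaluating.",
}
```

The format matters far less than the habit: goals, participants, scenario, and guardrails all written down before the first session.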

Step 2: Recruit the right participants

As part of your preparation, you’ll also need to get clear on who you’ll need to test with. The more specific you can get, the better. Recruiting the wrong people for your test can derail everything. 

In most cases, 5–8 testers is plenty (after about 8 tests, you’ll probably start getting the same answers over and over). Unless you’re testing with friends, you’ll want to budget to pay each participant for their time in some way. You can literally pay them, or offer something like an Amazon gift card. 

You’ve got a few options when it comes to recruiting:

  • Use a recruiting agency. This is more expensive, but a good option if budget isn’t a concern. 

  • Recruit from your existing customer base. Send out an email to existing customers and explain the offer. Make sure to include some kind of screening survey, though.

  • Recruit and pay people on the Internet. You can find people in Facebook groups, Slack communities, or anywhere else your persona might hang out. Again, make sure you pre-qualify them with some kind of survey or questionnaire (there’s a rough screener sketch after this list).

  • Ask people you know or in your extended network, if all else fails and you have zero money. This really only works if your network happens to align with your target audience, OR if your audience is really broad and you want more general usability insights.
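Whichever route you take, the screener itself doesn’t need to be fancy: a few questions, each with answers that disqualify someone. Here’s a rough sketch of what that logic can look like (Python, for illustration only; the questions and disqualifying answers are invented examples, not recommendations):

```python
# A rough screener sketch: a few questions plus a simple pass/fail rule.
# The questions and disqualifying answers below are made-up examples.
SCREENER = [
    {"question": "What best describes your role?",
     "disqualify_if": ["UX designer", "Web developer"]},
    {"question": "How often do you shop online?",
     "disqualify_if": ["Never"]},
    {"question": "Have you used our product before?",
     "disqualify_if": ["Yes"]},  # depends on whether you want existing users
]

def qualifies(answers: dict) -> bool:
    """Return True if none of the answers hit a disqualifier."""
    for item in SCREENER:
        answer = answers.get(item["question"], "")
        if answer in item["disqualify_if"]:
            return False
    return True

# Example: this respondent would be screened out for being a web developer.
print(qualifies({"What best describes your role?": "Web developer"}))  # False
```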

Step 3: Test your tech

When running remote tests, you’re going to want to test out your tech in advance. There’s nothing worse than having a user test go wrong because of something silly like a tech fail. 

Rehearse and test out the tech you’re using internally to make sure it works. You can also think about what might go wrong with the tech and how you’ll adapt.

Step 4: Run the remote user tests

It’s time! Now you just need to run the tests. Hopefully, you’ve already created a detailed plan, and all you need to do is stick to it as much as possible.

To avoid as much bias as possible, make sure you’re not asking leading questions as you interact with participants, and be careful not to paraphrase for them or feed them answers. 

Step 5: Synthesize what you learned

Synthesis is a crucial part of any research or testing method. You need to be able to take a critical look at your test results to find themes and patterns that will give you action steps. We love this 4-step synthesis process by Daniel Klein, which uses affinity clustering to create insight statements.
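If it helps to see the basic idea, here’s a simplified sketch of the affinity-clustering step: tag each observation, then group observations by tag so themes start to emerge (the notes and tags below are invented, and this is not a substitute for the full 4-step process):

```python
from collections import defaultdict

# A simplified sketch of affinity clustering: tag each observation, then group
# observations by tag so themes (and eventually insight statements) emerge.
# The notes and tags below are invented examples.
observations = [
    ("P1 didn't notice the 'Get started' button", "navigation"),
    ("P2 scrolled past the pricing table twice", "navigation"),
    ("P3 called the plan names 'confusing jargon'", "terminology"),
    ("P5 asked what 'workspace' means", "terminology"),
    ("P4 re-read the onboarding tooltip three times", "onboarding"),
]

themes = defaultdict(list)
for note, tag in observations:
    themes[tag].append(note)

for tag, notes in themes.items():
    print(f"Theme: {tag} ({len(notes)} observations)")
    for note in notes:
        print(f"  - {note}")
```

Each theme then becomes the raw material for an insight statement you can act on.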

Mistakes to avoid for user testing content

No user test is absolutely perfect, but it’s our job as user testing facilitators to make it as good as it possibly can be. Although there are a lot of potential mistakes you can make—like failing to plan ahead, recruiting the wrong participants, or rushing through synthesis—the less obvious mistakes can be the most troublesome.

Failing to examine bias

No one is completely free of bias. What’s dangerous is when we are completely unaware of our own biases and assumptions, and we let them seep into our work unnoticed. Biased user testing leads to biased results, which can lead to biased user experiences that might be racist, sexist, inaccessible, or otherwise exclusionary in some way.

Essential resources for tackling bias in content work:

Interrupting and disrupting

Especially during remote moderated user testing, it’s easy to accidentally help our participants too soon if they are struggling. For example, we’ve asked them to complete a task, they can’t figure it out, and they ask us how to do it. Answering them immediately defeats the point of the test. We’re not there to help them use our crappy product; we’re there to observe so that we can learn why it’s crappy and then fix it. I like to say that interrupting is disrupting, because it rhymes and it’s true. Try to stay out of the way as much as possible, even though you’re on a somewhat awkward Zoom call with a stranger who is asking you for help.