Observing user research

Your job, in a nutshell, is:

Listen to the interview without interruption
You will have a chance either to ask questions directly at specific points during the interview, or to send questions to the moderator through a back channel so they can weave them in.

Take notes
Write down as much as possible, as quickly as possible, using the user’s own words and phrases (don’t paraphrase). This is critical because it allows us to capture quotes, which can be really helpful in conveying what we discovered to other stakeholders. If it’s a remote test, you probably want to turn off your microphone to mute keyboard noise and turn off your video camera so the participant isn’t overwhelmed by observers.

Specific tips

Listen
In general, the participant should be doing most of the talking. Listen carefully and try to take notes using their own words (instead of paraphrasing), written in the first person.

Be neutral and disinterested – this is not a sales pitch
You may be excited about your product, but if you show your excitement you will bias the feedback you get. Users generally want to be nice and protect your feelings (even if you explicitly ask them not to). The more they sense that you are invested, the less likely they are to point out problems.

Be unhelpful – this is not training
If a user asks you “How do I complete this task you asked me to do?” try to avoid giving away the answer (but don’t be rude). The moderator will usually ask the user, “How do you think you might do it?” or “Is there another way you might try to complete the task?” This process can be uncomfortable, but resist the temptation to jump in and help.

Resist confirmation bias
If someone says something that contradicts your own hunches, it can be easy to unconsciously disregard those comments. Try to write everything down.

The goal of interviewing users is to uncover the underlying user needs, not to document feature requests
Listen to how people are thinking about topics and what criteria they use to come to conclusions, not necessarily the specific desires they voice.

Take notes on body language and facial expression
If a user frowns, sighs, shakes their head, laughs, leans in, leans out, hesitates, squints at the screen, etc., pay attention to that and write it down.

Write down what users do and don’t do, not just what they talk about
Note things like whether the user skipped the navigation in favor of the search bar, clicked the wrong thing first while completing a task (and what they clicked on and why), asked for instructions on how to use the design, made wrong turns, etc.

Ask questions when the moderator gives you an explicit opportunity
The moderator will sometimes trail off, use awkward silences, or play dumb to extract additional feedback from the user. It’s important not to interrupt this process. Jot down your questions and wait for a specific invite from the moderator.

People are contradictory (aka users lie). Users tend to be very bad at predicting their own behavior
Asking, “Would you [do this or that]?” is almost always going to elicit a ‘yes.’ You’ll generally get more accurate results by asking about past behavior than by asking about hypotheticals.

Don’t take every word as gospel. These are just the views of a couple of people
If they have good ideas, great, but trust your intuition in judging their importance unless there’s significant evidence otherwise. If someone says, “I hate the green,” that doesn’t mean you should change the color (though if everyone says, “I hate the green,” then it’s something to research further). (1)

Usability tests are not statistically representative
If three out of four people say something, that doesn’t mean that 75% of the population feels that way. It does mean that some number of people may feel that way, but the sample is far too small to support any numerical conclusions.

Sources:

  1. Observing the User Experience: A Practitioner’s Guide to User Research (2nd Ed.) by Elizabeth Goodman, Mike Kuniavsky, and Andrea Moed.

  2. Usability Testing Pocket Guide by Dana E. Chisnell and UsabilityWorks.