Hone Your Customer Interviewing Technique

A few years ago Modus graciously sponsored a trip for me and several other team members to attend a lean startup weekend. If you’re not familiar with lean startup, its main purpose is to validate your business idea before sinking significant time and money into it. To achieve this goal, the mentors at lean startup weekend repeatedly stressed the importance of “getting out of the building” and interviewing potential customers.

So we heeded their advice. Just a few hours into the first night, our team eagerly scampered out of the building — young trailblazers embarking on a wondrous adventure to uncover disruptive learnings.  

But we soon realized something important.

Interviewing is really really hard.

And interviewing without knowledge of best practices is really really dangerous. In fact, running a biased or misleading interview can be much worse than running no interviews at all. Doing so can return false positives and convince us to spend even more time and money on a venture that is still doomed to fail.  

But what makes an interview biased or misleading? Let’s work through an example.

One of our founders, Pat Sheridan, frequently mentors at lean startup weekends. Before we experienced our own lean startup weekend, he shared some cautionary tales with us. One team that he mentored had an interesting business idea: pay Facebook users money to post endorsements for companies’ products and services, and in turn receive a kickback from those companies.

The riskiest assumption here is that Facebook users will be willing to alienate their Facebook friends to make extra cash. Yet the team’s primary line of interview questioning did not validate this assumption. Instead, they asked people, “Would you like to make money simply by posting on Facebook?”

There are a few crucial issues with this question. First, the question never explained how the process actually worked; it swept the potential social cost under the rug. Second, by starting the question with “would you like…” the team was already biasing the interviewee to answer in the affirmative. Open-ended questions are almost always more insightful and less leading than yes-or-no questions.

In the end, we learned quite a bit from Pat and the lean startup weekend, and we employed those learnings in the years that followed. But as new members joined our design team or expanded into new roles, we wanted them to undergo the same learning experience.

We found a perfect learning opportunity in a new internal product we were designing. As we do with any new product, we wanted to validate the business idea before actually building it out, so we used this as an opportunity to practice our customer interviewing skills. While there’s plenty written on the high-level customer interview process, we’ll delve into the specific learnings and interview best practices that came out of our experience. Three of us ran interviews, and we each learned different things and found different areas to improve on.

JD’s lessons learned

Practice, Practice, Practice

We’ve already established that interviewing is really hard to do well. When I did my first interview years ago, I was admittedly pretty dreadful at it; most people are. It’s easy to get discouraged or even decide that interviewing just isn’t for you. But, at the risk of sounding cliché, practice makes perfect in interviewing just as it does in any other difficult task.

Now, as mentioned, there is a certain risk associated with performing actual customer interviews without first polishing your interview technique. After all, these interviews will shape your business. The thought of bankrupting your future because you biased your customer interviews is daunting. That’s why we recommend starting by practicing your interview skills in low pressure situations. Make a habit of doing practice interviews with family, friends, and co-workers before you graduate to actual prospective customers.

Check The Tape

We often record interviews just for peace of mind — so we don’t have to scribble notes throughout the interview. Drew touches on this point a bit below, but watching these recordings can teach you a ton about your interviewing technique as well. You don’t need to watch every interview you do. If you’re running multiple interviews, I would recommend watching your first session, a session in the middle, and your last session, so you can track how you have improved.

Brace yourself for some awkwardness. Watching yourself interview can be cringeworthy (“My voice sounds like that? Ewwwwww”). Pay attention to whether you’re introducing bias into the interview. Also observe your body language and facial expressions. Try to keep a relatively straight face throughout the interview — interviewees easily pick up on expressions of dissatisfaction or affirmation. I also found that I have a tendency to fidget quite a bit and break eye contact with my interviewees.

How You Talk Matters

Interviews are, by definition, conversations between people. Just like any conversation, an interview is subject to social norms. To put it bluntly, customer interviews create awkward social situations — there’s just no getting around it. If you’re approaching people cold, you may be interrupting a complete stranger’s day to ask them questions about something they potentially don’t care about. This puts the onus of developing rapport on the interviewer.

When I watched my recordings, I saw that I frequently compensated for this social awkwardness by straying from the interview guide and adopting what felt like a more conversational tone. I found that I repeatedly “hedged” my questions and statements. A hedge is a word or phrase that lessens the impact of what you’re saying. Here are a few examples:

“I guess I’d like to hear more about the team dynamic during the meeting.”

“I’m somewhat interested in what materials you ask people to bring to meetings.”

“We essentially want to explore how teams, like, interact during decision making meetings.”

Unfortunately, introducing hedges into my speech made my questions much less clear and direct. I also came off as lacking confidence, which jeopardized my authority to steer the interview. So, my recommendation is to ask your interview questions clearly and directly. This is one of the harder tasks to accomplish when interviewing, so you may need to practice quite a bit before really nailing it.

Give Customers Something Tangible to Try

The lean startup methodology preaches doing as little as you can to learn as much as you can. Often, this translates into interviewing customers without showing them an actual representation of your product. This approach is great if you’re only trying to validate your value proposition. But in today’s world, where convenience and delight reign supreme, your product’s user experience is an integral part of its value proposition. Interviews alone cannot validate that part of the value.

That’s why, when we interviewed customers, we started by interviewing designers about the problem space and then transitioned to having them try an actual prototype. For our first round, we interviewed ten people. Of those ten, the two who declined to sign up admitted that the product concept was attractive but said the product itself felt unfinished because of usability issues. I learned that user experience can be the sole factor driving someone away from signing up, so it’s worth taking an extra week or two to build a prototype. Validating the user experience was especially important for our product because users have access to almost the entire feature set before they sign up.

Give Participants Cliff Notes

I used to start my usability testing sessions with a lengthy, verbal introduction. It covered several topics: permission to record, the purpose of the session, how to think aloud, what to do if questions come up, etc. In fact, after my first few sessions I came to realize that it was way too much information for participants to digest. They would forget to think out loud or even forget why we were doing the session to begin with.

Now, I give participants cliff notes to reference during my intro and throughout the test. These four simple bullet points hit the most important things I need participants to remember:

  1. You can’t do anything wrong.
  2. Think out loud.
  3. Give us honest feedback.
  4. I may not be able to answer questions right away.

If you’re looking for a long form script to read or tweak, you can download our usability testing script.

Use Multiple Data Sources

Whether you’re performing a traditional interview or doing ethnographic field research, what a participant says is often very different from what you observe. For example, you might observe a participant really struggle with a task, only to have them tell you the task was easy at the end of the session. This happens for a variety of reasons: researchers have shown that participants sometimes over-report positive experiences in order to please the researcher, or to avoid appearing unintelligent.

This is why triangulating, or analyzing multiple data sources, is important. In the example above, if you only asked the participant how difficult the task was and did not also observe them complete the task, you could miss a significant issue with your product.

Every Interviewer Is Different

Different approaches to interview preparation and technique will work better for different people. For example, I find it helpful to have a script I can read at the beginning of the session. It helps me make sure all participants get the same instructions and comforts me because I have a fallback if I forget what I’m supposed to say. However, for my teammate Drew, reading from a script actually makes the interview process more difficult. He prefers to memorize the script and interview guide. The important lesson is that both of these approaches are valid — do what works for you.

Drew’s lessons learned

Relax

This may seem obvious, but just like a job interview, user testing can be anxiety inducing. There is so much to think about when testing, and in the push for clear and concise results you can get ahead of yourself. When I first started user testing, I was so focused on getting through all of the questions that I often neglected empathy and forgot to treat the test like a conversation between two people. When preparing for user testing, do a few trial runs with friends and family, and practice the questions beforehand so you can focus more on the answers the user gives. This will give you the opportunity to build off of their answers and experiences.

Remember that the user is just as nervous as you, and you will need to spend some time putting them at ease as well. A common practice for doing this is telling the user, “You will not be graded on your performance.” We also try to avoid words like “test” or “quiz” during the session because they can cause anxiety in their own right.

Timeboxing

You’re going to run out of time; it’s inevitable. Some of the most informative user testing sessions result from long tangents and from building off of the user’s experiences. Sometimes users can talk about an experience for what seems like an eternity. I’ve been part of a few sessions where the intro questions took so long that we had to rush the user through the prototype testing just to get validation, which makes it hard to compare results across sessions. Giving yourself a fairly strict timebox for the intro questions is a great way to get through the session. Never be afraid to politely interrupt the user when you feel it’s time to move on. If a specific user has valuable insights you can’t dive into deeper, consider scheduling a follow-up session to build on the conversations cut short.

Focusing on a specific meeting or situation

When testing a user, it’s easy to ask questions that are too open ended. Avoid questions with words like “usually”; instead, ask about a specific time an event happened, such as “the last time you were in a meeting” or “the last time you dropped out while signing up for a product.” Focusing on a specific event directs the user’s responses toward a real situation and produces more direct feedback. Questioning a user in a general sense often creates hypothetical situations, in which the user will unintentionally answer with responses like “Yeah, I could see that happening.”

Building off of user responses

Initially, the hardest thing for me when user testing was getting past the urge to ask all of the questions. This caused me to unintentionally lose focus on the user’s responses, and I’d get anxious about coming up with follow-up questions that weren’t in my written script. A great tip JD gave me after a difficult interview was to give yourself time to think about a response by reiterating what you heard the user say. This shows both sides that you heard what was said, and in most situations it prompts the user to add more clarification to their initial response.

Speaking your mind

When user testing, especially with prototypes, it is extremely important to let users know you want them to explain out loud what they’re thinking as they go through the prototype. Explain that you want to know what they’re looking at, what they’re thinking, and what they’re planning to do. You can even ask them to explain what they expected to happen after they clicked on areas you’ve identified as possible hang-ups, missed pathways, or unclear interactions.

The Silent Treatment

I found that when testing users on a prototype, they constantly ask you questions. They want to know what buttons will do before taking the intended action. Do your best not to answer these questions; let the user find their own pathway through trial and error. If you give them the answers, you’re throwing away all of your hard work and invalidating any flow insights. It’s OK to remain silent when these questions come up, but before the session starts, let the user know that you may not be able to answer every question they have, as you want to see how they interact with the system.

An alternative to remaining silent is to reflect the question back to the user. A common occurrence: when a user asks, “What happens when I click this button?” you can follow up with “What would you expect to happen if you clicked that button?”

Record, Review, and Take Action

We always start our testing sessions by asking the user’s permission to record, mentioning that we will not post the recordings publicly and will only use them internally to better our product. A recording is not only useful for reviewing responses later — you may miss comments in your notes that you want to revisit — but also a handy tool for watching your own body language as the interviewer. The first time I watched myself test a user, I noticed a million things I felt I did wrong or could improve on. Of course you’ll be hypercritical of your own actions, and nine times out of ten the user doesn’t even notice them. Being aware of these areas for self-improvement will only help you lead better testing sessions in the future. Above all, have fun with user testing. There is nothing more valuable than honest feedback from the actual people who will use the product you’re creating or improving.

Matt’s lessons learned

Formulate interview questions

Customer interviews require lots of preparation and effort. Come up with a plan that identifies the concerns, areas of interest, and goals for the research. This plan is crucial because it will drive the interview questions.

Make them feel comfortable and keep it conversational

When kicking off the interview, set the scene and make the customer feel welcome; that way it’s easier for them to open up to you and give as much feedback as possible. Start off with a few warm-up questions so the customer doesn’t feel like they’re in an interview or a test:

“Where are you from?”

“What is your job role?”

“How often do you shop online?”

Then move to broader questions related to the upcoming session. That builds trust and helps the customer feel more comfortable in the information space, letting them open up and share their perspectives.

Answer questions with questions

During a usability test, it is very easy for the customer to get stuck, and the first thing they do is turn to you and ask what to do next:

“Should I click here?”

“What should I do now?”

It is very easy to fall into the trap of telling them how to continue through the prototype, but try to take advantage of the situation and ask them questions to learn why they got stuck:

“What do you think you should do?”

“How would you search for help in this case?”

Product perception changes

Before you conduct user tests and start gathering feedback, your mind is already set on the idea, and you start believing it could be the next big thing. After a few interviews, you start finding flaws and receiving plenty of criticism. Your perception of the product you are building really starts to change, and that is totally OK; never let passion and enthusiasm blind you. If you are stuck, or if you see that the current product direction has a less bright future, find ways to enhance the idea or pivot to a different one. Continue to iterate and keep collecting feedback through further testing.
