Practical User Interviewing

We (Dave Crosby and Jason Porritt) recently had an opportunity to invest in user interviews for a new product idea we’ve been discussing around the office. We (developers) are not necessarily the target audience of this application, and we weren’t certain that the basic premise would be useful to those who were. Rather than becoming an obstacle, this ignorance left us with a clean slate, and a real opportunity to let our ideas take shape around the actual needs of our intended users.

Working on short notice and a tight deadline, we made some quick contacts and hit the streets. The following are a few practical points we learned.

Send a pair

At an in-house UX workshop run by Lane Halley and Jeff Patton, we learned the simple-but-effective technique of pair interviewing: the primary interviewer engages the user and asks questions while the backup interviewer takes notes on cards. This technique saved time and kept the conversation flowing, freeing the primary interviewer to follow emerging topics and return to the script at will.

(Noob note: Dave wishes he’d rehearsed a two-sentence intro to the pair interview process before sitting down with the user.)

Keep the script simple

We found it very easy to write a bulky, two-page script for our interviews, and that was for a relatively simple application. As others have suggested, we used a variety of open- and closed-ended questions, but we found that the length of the script made it unnecessarily difficult to track what ground we had already covered.

In most interviews, conversation flowed naturally from the open-ended questions to cover the areas we were most interested in. Gentle corralling of the dialogue was easy because we were generally familiar with the area we wanted to cover. Very specific questions seemed to contribute most to the clutter and caused us to pause awkwardly a few times while we hastily scanned for the next item to cover. This leads us to the next point…

Reduce or separate detailed questions

We had too many detailed questions scattered among our higher-level discussion prompts. In some cases, the detailed questions could have been grouped at the beginning or end of the session (e.g., what browsers do you use?). Others could have been left out entirely, as they were merely trying to guide the conversation—something that worked best when the interviewer tailored the guidance dynamically to the discussion already in motion.

Next time, we would likely reduce the number of very detailed questions or separate them from the script as a brief survey or cheat sheet.

Set a time limit

One of the other challenges of the interview process was keeping on schedule in the midst of quality conversation. We interviewed busy people who were graciously taking an hour or more out of their day to talk with us, so it was important to respect their time. In our third interview, we set an alarm for 30 minutes so we’d notice when our time was half over. It helped shorten the interview, and we still covered all the information we hoped to.

Don’t demo right away

We had a demo of the application with us—both in paper form and on one of our laptops. Our willing subjects knew a bit about what we were going to propose, but we didn't want to show the demo at the very beginning because we were wary that it might color early conversation. Our first questions were meant to add context to later, more targeted questions, and we gained useful insights precisely because the subjects weren't talking about our application yet.

Presenting details and the demo close to halfway through the interview left sufficient time for us to discuss specifics of our concept (about 30 minutes). We felt like this was about right.

Let them have it

As a way to thank the interviewees for their time, we offered free, supported use of the tool in our upcoming closed-beta release. The benefit of this is two-fold: our users get exclusive access to a new, potentially useful tool which they will have some influence over as it develops, and we get beta users with whom we’ve already established good communications. Our subjects seemed genuinely pleased with this offer.

Conclusion

Our goal for this round of interviews was to validate the basic premise of our product idea. To this end, we spent time with a small group of trusted contacts and really focused on understanding their process, their pain points, and whether our proposed tool could be of use to them. We successfully captured the data we needed and, happily, found affirmation of the product idea itself. Armed with these lessons, we expect an even better experience next time.

Jason Porritt

As a software developer at Atomic Object, Jason has used .NET, RubyMotion, JavaScript, and everything in between to build a wide variety of web and mobile applications.


2 Comments

  1. Posted October 5, 2010 at 12:41 pm

Great post! The user interview process is something I’ve been thinking about a bit lately, and your post is full of useful tips.

I wonder if you have had any experience with the 9 boxes technique (http://dnicolet1.tripod.com/agile/index.blog/1765142/nine-boxes/). We haven’t used it extensively, but I imagine it could help “keep the script simple” during user interviews.

  2. Posted October 5, 2010 at 12:41 pm

    I like to use a non-directed style of interview. The questions start with “how”, “when”, “why”, “where”, and “tell me more about…”.

    This approach prevents dead-end responses.

Read Indi Young’s book “Mental Models” for advice on how to collate and understand interview data.