When I started at Atomic, I had never participated in or conducted a usability test. In true AO teach-and-learn fashion, I’ve since been part of user testing on every major project I’ve worked on in my almost year here.
Here are a few concrete takeaways I would want to hear if I were learning about usability testing all over again.
1. Sit in on a test first.
This is probably obvious. But be sure to sit in on a test first, and not as the lead facilitator. See how your teammates run tests, and be there to ask supporting questions and take notes. I recommend pen and paper notes so that you don’t intimidate the tester with the clicking of a keyboard after every comment.
2. Start to prepare your script at least one week ahead of time, and build in time to rehearse.
It takes longer than you might think to create a script for a test. At AO, we have a few templates that we use as a starting place, but the context of each project is always different, so even the basic intro text will need rework.
3. Work with a pair.
As you start becoming more comfortable with the idea of leading a test (or heck, if you have to jump right in and lead one whether you’re ready or not), work with a more experienced pair who can coach you. In fact, working with a pair, whether you’re new to usability testing or not, is a great way to establish a good dynamic between you and the user. Your teammate can act as the note taker and support role, asking questions and guiding as needed. They should also give you critique after each test. For example, this partner can tell you whether you’re asking too many leading questions or reminding the user to think out loud often enough, helping you improve your facilitation skills.
4. Practice internally, with someone who is senior to you.
Once you’ve got a script drafted and you’re feeling pretty good about it, practice internally, at least a day before the real testing begins so that you have time to adjust your script and identify gaps.
When possible, practice the script on someone at your company who might be a good fit for the user set (this sometimes just won’t happen if your personas and users are really specific). I also suggest testing on someone who is senior to you; a little pressure is good in this case. A senior teammate can give productive feedback. Practicing only with someone who isn’t as well versed in user testing won’t help as much, because they won’t be able to coach you or identify holes in the script as well.
5. If something feels awkward in the script, it will be awkward in real life.
When you are drafting your script or practicing with someone, you will likely notice transitions that feel funny or comments that are just plain awkward. Unless you really, really need that information communicated to the tester, eliminate it. It will only be more awkward with a real user in a real test.
6. Make a script, but don’t feel bound by it.
A script is there to keep things on track and make sure each session is relatively consistent with the other sessions. For example, the tasks should stay the same, and the overall introduction should too. But whenever possible, shoot for a professional yet relaxed conversational tone over reading off of the script. This is hard to do, especially if you aren’t the type who loves to speak in public and may get a little nervous. But the more tests you run, the more relaxed you will be, and this will help the testers feel comfortable and relaxed too.
7. Set the same expectations for all the testers.
The experience of a usability test starts with the initial communications you send to potential testers. I came across a scenario recently where our team had direct communication with all of the testers to arrange the sessions, except for one. That one user missed the context, purpose, and timing for our meeting and that session was noticeably more challenging than all of the others. We unfortunately did not have control over this, but as much as possible, try to communicate directly with your testers, or create a template invite for other coordinators to use. This leads me to my next point.
8. Stay calm.
Over the three rounds of user testing I’ve participated in during the last ten months, I’ve met with a handful of users for each test. Almost all of the sessions went as predicted: we followed the script, introduced the product, ran through scenarios and tasks, and finished in the allotted time.
One session, though, did not go as planned. Nothing terrible happened, but there was some confusion about the setup of the meeting and why the user was helping us that day, and they were flustered and a bit thrown off. In this situation, we weren’t able to follow our script from the beginning, and that’s fine. But to keep the tester comfortable, I had to really remember to sit quietly and listen to what this person had to say, even though we were way off the script and the main goals of the session. I hadn’t been prepared for an interview to go off track like this, but we listened, nodded, and eventually guided the conversation back on track.
9. Recap briefly as a team.
Try to reserve time, even just a couple of minutes, after each test to sync with your team and run through high level takeaways. This helps in knowing — right away rather than hours or days later — if everyone was hearing similar things from that user. When you have time, go back and compile notes and a deliverable of action items and top concern areas.
I’ve learned a lot in the last 10 months at AO, but I think usability testing and facilitation is one of the most interesting practice areas where I’ve been able to gain a concrete skill set that I’m excited to improve upon.
Testing your product with users identifies gaps, challenges assumptions, and validates decisions. It connects designers, developers, and clients directly with the needs and concerns of real users, which is the key to a successful new product.