Newfangled works with independent agencies to create lead development web platforms for their clients.

The Website, the Webcam, and the Test Plan: Simple and Exciting Website Usability Testing

Allow me to set the scene: I'm standing behind the first of several testing volunteers we've arranged to come and spend a few minutes using our website. He's sitting at my computer studying our homepage, which we've asked him to do for a minute or so without clicking any links. Lauren, one of our project managers who is guiding this round of testing, asks him to narrate his observations.

"Clearly Newfangled are web developers. That's easy enough...You guys have a lot of different clients."

So far so good, I'm thinking. We're so awesome. And our website is so rad.

"..If I was just coming to this site the first thing I would wonder is...well, every thing seems pretty specific, like, lots of little articles..."

True, true, we work hard. Thanks for noticing.

"...but if none of these interest me, I don't know exactly where I would click to just see a basic, um, "what we do"—even like a two line..."

Wait, what? Really? My sense that this would be a fun, expertise-confirming experience just evaporated. No, not evaporated—that sounds too pleasant. More like melted down, like the way that one bad guy's face does after he's covered in toxic waste at the end of RoboCop.

Doesn't he see the positioning statement (two lines, by the way) right there in front of his face? Is he blind???

I guess that's what the earliest stage of failure feels like—when you haven't even realized you've failed. For a brief moment, you're the victim of a cruel injustice until you realize that, no, he's right. Our positioning statement is barely there. The most important statement our homepage could make sits quietly tucked beneath the much louder and less critical slideshow—which, as it turns out, moves so fast that most of our volunteers couldn't process the messaging it contains. We were only a minute and twenty-five seconds into our tests, and yet we had weeks of work before us.

I'm too close to my own website to judge it, and so are you.

All melodrama aside, website usability tests are not the kind of experiences everyone walks away from feeling like a winner, which is exactly why you should do them. They don't just expose the flaws and weaknesses of a website's design or construction; more importantly, they reveal the inability of anyone close to the website to accurately judge its effectiveness. Though we had, of course, expected actionable feedback from the test subjects, that expectation didn't preclude a bit of a sting in actually receiving it. But these tests present a positive opportunity to learn what you are not able to see for yourself. If you can take that point of view, usability testing should produce excitement, not resignation.

This month I'd like to share with you the simple video-enhanced usability testing procedure we used on our own website, what we found, and what we're going to do next.


Mike | June 7, 2011 3:48 AM
Hey, this is a good case study. I am doing a study myself with real people for some projects. If I did the test on the form here, what I miss are real labels like: name, email, url, comment... When I click inside the form the description is gone, and when I get distracted while writing this and come back to the form, I don't know what to write inside... ;)
Christopher Butler | May 17, 2011 10:50 AM
Ellen: We haven't used any heatmapping software…yet. But, as to your question about the value of the webcam portion of the videos, I think it is helpful to have. We record the videos at actual size (the embedded versions are, of course, smaller), so we do get more detail that way. It also helps for general cues, like matching their eye movement and/or other expressions to what is happening on the screen they're using. That general information helps to explain long pauses in mouse movement or page scrolling, or something else that might be hard to explain without it.

Jacob: Very cool! I'll add this to my list of tools to try out. Thanks for letting us know about it!

Ryan: I'm so glad to hear that this was a helpful resource to you—that's our main goal. Thanks for stopping by and saying so. Keep me posted when you get a site up!

Ryan Bentin (Not online yet) UGH | May 17, 2011 2:18 AM
I am in the beginning phases of building my website and am far from this phase. With that being said, this article was perfectly motivating and helpful. How simple yet so effective is this process? Thank you so much for providing this to us, knowing this method eases my mind on some things. Oh, and I will be purchasing Steve Krug's book on user testing because of this article.
Jacob Creech | May 5, 2011 10:30 PM
Interesting post, really well written and very insightful!

I have a handy tool to add to your list (although since I work on it, I'm probably a little biased).

Basically, you upload your designs, create tasks and then send it out to whoever you like via email, facebook, twitter or any other medium that suits. It then generates a heatmap based on their interactions with your site.

It's quite good because you don't even have to get people outside of the comfort of their own home (or office), and it requires very little in the way of setup for you. Since you can see the results in realtime, you can also make changes or tweak your designs on the fly. It works well for us, anyway.

Thanks again for sharing; always interesting seeing how people come at these sort of tasks.

Ellen | April 28, 2011 9:22 PM
How helpful is the webcam video in the long run? It's pretty small and difficult to really get much from the facial expressions of the person using the site. Have you used any heatmapping software like in the Nielsen article I'm linking to?
Christopher Butler | April 28, 2011 8:33 AM
Jeri: Glad you enjoyed it. Thanks for saying so!

Brett: We'll certainly keep it coming. I'm glad to hear you're willing to spend the extra time reading through longer articles like this one. Thanks for your comment!

JT: I'm glad you said that—this is an intentionally simple approach. We've noticed that usability testing tends to just not get done much once a site is launched, though that is increasingly the best time to do it. So rather than pushing a very complex approach on our clients, whom we already know are having difficulty doing everything they know they should do, we felt it was important to start small and go for the low-hanging fruit. As it turns out, doing that is, on its own, incredibly productive. And that's the whole point, I think: if a 15-minute usability review can produce these kinds of helpful and actionable insights, we have no excuse not to do them!

As for your second point about recruiting more in-the-know testers—I'm not sure I agree. I am sure there are cases which, as you say, might require a level of familiarity the average person probably doesn't have in order to properly assess the effectiveness of an application, but I think basic usability (again, we're going for a simple approach here) is something common to all applications. Anyone who regularly uses the web should be qualified. As I mentioned in the article, anyone too close to the website—the designer, developer, project manager, etc.—is not unfamiliar enough to do this well. However, they'd make great observers of the tests.

Oh, one last thing about improvisation. That's a great word for this. Lauren Walstrum just posted some tips for usability tests like these, and I think the subtext of all of them is flexibility. Going into even a simple process with a set plan is a good idea, but I found through doing a series of them that improvising the follow-up questions based upon what I was observing was key to extracting the truth about what our volunteers were experiencing. That may not be the most scientific thing in the world, but at this point, I'm content with a testing process that is one part science, one part art.

Thanks for sparking a convo about this.
JT | April 27, 2011 8:45 PM
I like the presentation, so don't get me wrong when I say this seems a little simplistic. I'm having trouble seeing how this process would scale and be at all adequate to something more sophisticated, like an app or a site with thousands of users.

I've not been involved in many professional user studies, but I'd guess that the more complex the application (and I realize you're talking about less complicated sites here), the more the user groups doing the testing would have to be at least familiar with what the application is about generally, like industry norms and such. This primer feels a bit more improvised than all that...Am I right?
Brett Wilson | April 27, 2011 1:06 PM
I'm always impressed with Newfangled - you never let me down. In the course of a busy week, I love taking time to slow down and see what fantastic insights you have to share - in a relevant and helpful way. Thank you - keep it coming!
Jeri Hastava | April 27, 2011 12:16 PM
Great case study. Thank you!
Christopher Butler | April 27, 2011 9:47 AM
Alex: Glad you enjoyed it! The testing process was actually a lot of fun for us, too.

Alan: I really felt that using ourselves as a case study—especially because we haven't yet implemented any improvements—would help to make the case for doing testing like this and expecting similar results. Thanks for reading!
Alan Beberwyck | April 27, 2011 9:11 AM
Thanks for sharing a simple process for conducting a very enlightening exercise – this should be required learning for anyone working on websites. I appreciate your transparency and honesty in sharing the perceived shortcomings of your own site – made the content more engaging and credible. Well done.
Alex | April 26, 2011 5:51 PM
Great read! Loved the story feel, the title, the intro, and the image. They all definitely pulled me in to something I would probably not ordinarily be that interested in. The videos were a nice surprise, too. Was nice to see the process in action.
