Newfangled works with independent agencies to create lead development web platforms for their clients.

How to Do Basic Website Usability Testing



First, Some Background Information

Before I explain how we tested and what the tests revealed, let me first share with you a bit of background on our website. In the summer of 2009, we began working on a complete redesign and rebuild of our site, which had last been re-launched in 2006. It had served us very well for more than three years, but the growth our firm had experienced over that time—both internally and externally—provided a mandate for change. This time, we decided to treat the project in the exact same way we'd treat a client project; we actually followed the planning process we talk about so often, defined a set scope of work, priced it without any kind of "hey, it's for us" discount, subjected it to our resourcing department for scheduling, and put Katie, one of our project managers, in charge. Mark and I, the "clients," approved the $60,000 budget (no joke, this kind of thing is serious business), and prepared to balance our dreaming with the occasional, realistic "no." And you know what? It worked out. We stuck to the scope, remained under budget (thanks to Katie's stellar management and everyone else's self-control), and produced a brilliant new site that did everything we hoped it would. Oh, and we're all still friends.

By January 2011, just a year after launching our redesign, we began to sense that there were still some things about the site that needed to change. We knew there were new things we wanted to do that would require some work, but we also suspected there were minor things that needed to be tweaked that we hadn't yet identified. After all, we use the site every single day; the chances that we'd notice the flaws new visitors are experiencing are very low, if not zero.

 

There Are No Perfect Launches

On that note, I'd like to pause for just a moment to make an important point about the web: There are no perfect launches. It's appropriate that we refer to launching a new website as "going live"; a "live" website begins to change radically and rapidly once it starts being used, which demands ongoing maintenance: bug fixes, design tweaks, functionality changes, and the like. Though we were anticipating a focused round of work a year after launch—what we often refer to as a "Phase 2"—we had been consistently maintaining our site since its relaunch, discovering things that we just would not have been able to anticipate until it began to be used. I say all of this to answer what I can imagine might be anyone's response to the kinds of things usability tests reveal: "Why didn't you build it that way in the first place?" The answer is simpler to say than it is to experience, but I assure you that the reality is sound. You can build a website with all the intelligence, care, earnestness and good intentions possible, but you will find that some of your decisions were good guesses that proved wrong.

One of the wonderful things about the web is that it is interwoven with the people who create and use it, making it an organic entity and environment that is just as unpredictable as the rest of our lives so clearly are. Our job is to continually adapt what we do and how we do it as we learn new things by observation—it's all a work in progress. So, moving on: usability tests will reveal better ways of doing things that you've already spent time and money doing differently. It's ok; that's exactly what they're for.

 

A Very Simple Process

We wanted to know how we could improve our site—we were serious about that—but we didn't exactly have in mind mounting a long and expensive process to get there. Fortunately, we had an established model to follow provided by our old friend, Steve Krug, of Don't Make Me Think fame. Krug's most recent book, Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems, provides a fantastic overview of how to do usability testing in a way that delivers fast and valuable results. If you're interested in usability testing and quality assurance methods for web projects, I'd heartily recommend the book; it definitely belongs on every interactive designer's bookshelf. After reading it as a team, we distilled Krug's advice down to a simple three-step process that can be replicated for just about any website. Before I specify the steps, though, I'd like to offer a caveat: This is a basic procedure; Krug provides and recommends more. On its own, it will certainly provide plenty of insight, but it could easily be expanded upon to test more complex functionality.

Step 1: Customizing the Test
Sitting volunteers in front of a website and recording them as they browse is not usability testing. Your first step is to plan the test specifically around the goals of the website you want to evaluate. The framework of the test we ran was pretty basic. First, we had our volunteers spend a few moments (a minute or two at most) simply orienting themselves on the homepage, narrating their observations as they did. We asked that they not click any links away from the page, but just observe what it contains (scrolling is OK) and then explain to us what they perceived the website to be about. Many volunteers may have difficulty comprehending the purpose of an especially tightly positioned site, but the goal of the orientation is to observe how well the homepage communicates the website's purpose—even if the volunteer doesn't understand what it means. For example, if a volunteer rightly concludes that a website offers "industrial plastics extrusion equipment," you could give the site points for clarity even if the volunteer doesn't know what "industrial plastics extrusion equipment" is.

The rest of the test should be based upon 3-4 specific tasks that you would assign to your volunteer. Each task should correspond to specific goals of the site and test the user's ability to complete those goals. These could include subscribing to content, registering for events, requesting more information, or making a purchase—as long as they target the primary purpose of the site. For example, a test for a website like ours, which relies heavily upon a written content strategy and has corresponding sign-up calls to action, should include at least one task in which the volunteer must find a specific article and sign up to receive that content.
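
For what it's worth, a test script like this is easy to write down once and reuse verbatim across sessions. Below is a minimal sketch of how that might look if you prefer to keep it in code; the task wording, names, and goals are hypothetical placeholders, not our actual test script.

```typescript
// A hypothetical, reusable test script; all wording and goals below are placeholders.
interface UsabilityTask {
  name: string;     // short label for your notes and master list
  prompt: string;   // what you read aloud to the volunteer
  siteGoal: string; // the site goal this task is meant to exercise
}

const orientation = {
  prompt:
    "Without clicking away from this page, look over the homepage " +
    "(scrolling is fine) and tell me what you think this site is about.",
  timeLimitMinutes: 2,
};

const tasks: UsabilityTask[] = [
  {
    name: "Subscribe to content",
    prompt: "Find the article titled X and sign up to receive future articles.",
    siteGoal: "Content subscription",
  },
  {
    name: "Register for an event",
    prompt: "Find the next upcoming webinar and register for it.",
    siteGoal: "Event registration",
  },
  {
    name: "Request more information",
    prompt: "Find a recent project and its price range, then ask a question about it.",
    siteGoal: "Lead generation",
  },
];
```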

Step 2: Administering the Test
Ideally, your volunteer should be someone who has little to no familiarity with the website you are testing. That means that the designer, developer, planner, account executive, client, and CEO need not apply. Anyone else, theoretically, is qualified. By the way, since you're only going to be asking for about 10 minutes of this person's time, volunteer is certainly still an appropriate word, though it couldn't hurt to reward them for their effort—a Starbucks gift card, lunch on you, something like that. And about those 10 minutes: try to have everything set up before your volunteer arrives so you can keep the session as short as possible. That should amount to turning the computer on, loading the site in an uncluttered browser, and opening up whatever screen-capture software you've chosen to record the session—we've really enjoyed using Camtasia.

Once your volunteer is comfortable, go ahead and start recording. Take them slowly and clearly through each test question, starting with the homepage orientation, being careful not to lead them to any conclusions. Feel free to ask clarifying or expanding questions—things like, "What's happening here?" "Can you say more about that?" or "What was difficult about that?"—but nothing specific enough to transfer any suspicions or biases you already have. Remember, that's why you are not sitting where the volunteer is—you've lost the objectivity they have. It helps to have a hard copy of your test questions on which you can write notes during the test. Even though the video will capture everything your volunteer does, it may not clearly demonstrate other things you observe. If your volunteer gets stuck for too long on any of your 3-4 tasks, that's probably an indicator that something isn't working—either with the site or the test—and it is probably a good time to move on. You'll find that the briefer the test, the more productive it will be.

Step 3: Evaluating the Test Results
On its own, observing your volunteers will likely produce plenty of insight as to what is not working on your website. But since the process itself is controlled and systematic, it doesn't hurt to be a bit more methodical in your evaluation of the data you've gathered. Each of your testing sessions should have produced a roughly 5-minute video clip. After you've administered all the sessions (somewhere between 5 and 10 would be wise), go through each one and assemble a list of takeaways from each task, then consolidate the points from each session into one master list. Once you've done that, I'd recommend assembling your development team—all the people I said were not qualified to be volunteers—to watch the videos, review your master list, then brainstorm on solutions together.
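
If you'd like to be a little more rigorous about that consolidation step, here's a minimal sketch of one way to do it; the session notes below are invented for illustration. The idea is simply to count how many sessions surfaced each takeaway so the most common problems rise to the top of your master list.

```typescript
// Hypothetical per-session notes; each inner array is one volunteer's takeaways.
const sessionTakeaways: string[][] = [
  ["Missed the positioning statement", "Slideshow rotated too quickly"],
  ["Missed the positioning statement", "Confused by offsite registration form"],
  ["Slideshow rotated too quickly", "Confused by offsite registration form"],
];

// Merge all sessions into one master list, counting each issue once per session.
function consolidate(sessions: string[][]): Map<string, number> {
  const master = new Map<string, number>();
  for (const session of sessions) {
    for (const takeaway of new Set(session)) {
      master.set(takeaway, (master.get(takeaway) ?? 0) + 1);
    }
  }
  return master;
}

// Print the master list, most frequently observed issues first.
const ranked = Array.from(consolidate(sessionTakeaways)).sort((a, b) => b[1] - a[1]);
for (const [issue, count] of ranked) {
  console.log(`${count} session(s): ${issue}`);
}
```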

This simple process is likely to generate plenty of feedback and opinions, not to mention work you'll need to do on your website. It certainly did for us. So what I'd like to do now is share with you the tasks that made up our usability tests, an example video clip for each one, and the points of critique we pulled from these sessions. Since we have yet to make any changes to our site in response to the usability testing, our existing site, coupled with the data we gathered, should serve as a good case study for how this process can work.

(Thanks again to Kaley Krause, Strategist from BlogAds, who served as the volunteer in the video clips below.)

 

A Case Study: Newfangled.com

Homepage Orientation
Kaley was extra thorough in her review of our homepage, and unlike just about every one of our volunteers, did notice our positioning statement (at about 00:50). But the fact that so many of our volunteers missed the positioning statement clued us in to the need to make that information more obvious. We plan to redesign it—in particular, enlarge it—and move it up on the page.

Other volunteers struggled with the main slideshow that stretches across the top of the page. Those who began reading the slides complained that they refreshed too quickly, leaving too little time to process the message. Additionally, because we're using the slideshow to promote short-term and long-term initiatives, some users were more or less inclined to pay attention to it depending upon how familiar they already were with the site. Those who had seen it before and recognized a slide assumed that the slideshow never changes and ignored it in their session. Those who were new to the site assumed the slideshow's content was the most important thing on the page, and therefore paid closer attention to it than anything else. What we really want is something in between. Incidentally, the volunteers who did pay more attention to the slideshow were also less inclined to scroll further down on the homepage and explore the rest of its content. Not ka-blammo, as the kids say.

The key points from the Homepage Orientation (from all of the sessions we did, including Kaley's) were:

  • Make our positioning more clear.
  • Rethink the slideshow.
  • Get more content above the fold.

Task 1: Locating and Engaging with Content
Kaley headed straight for our search tool to locate the article I asked her to find—my newsletter from last June called Storytelling is the Future of the Web—which was exciting to see in action. I generally assume that most web users tend to find content by browsing available menus, but several of our volunteers recognized and used our search tool first. While using it, Kaley also uncovered a minor bug (which she didn't actually notice herself): When a newsletter page appears in the search results, the link it provides is to the "View on Single Page" version of the article. We'll fix this so that the link directs users to the paginated, "friendly link" version, just like the one you're reading right now.

Kaley had little trouble commenting. She quickly scrolled all the way to the bottom of the page—something that several of our volunteers struggled with, especially on pages with longer comments strings. However, she did uncover another minor detail that we should fix: When she clicked into the comment form fields, the field description disappeared. Since she had clicked into each of them before filling in her information, she was briefly confused about which fields were which. I can imagine that plenty of users might forget what each field label was if they did the same thing. She had no trouble using the sidebar form to subscribe to our newsletter, though.

Finally, some users experienced confusion after submitting forms and being directed to their specific "Thank You" pages. We realized that we could simplify the process by having the forms process without forcing the page to reload. That's now on our to-do list as well.
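
To give a sense of what we mean, here's a rough sketch of the kind of change we're considering: submit the form in the background and confirm inline, rather than redirecting to a separate "Thank You" page. The form ID and markup here are hypothetical, not our actual implementation.

```typescript
// Hypothetical inline form handling: submit in the background, confirm in place.
const form = document.querySelector<HTMLFormElement>("#newsletter-signup");

if (form) {
  form.addEventListener("submit", async (event) => {
    event.preventDefault(); // stop the full-page reload and "Thank You" redirect

    const response = await fetch(form.action, {
      method: "POST",
      body: new FormData(form),
    });

    // Swap the form for a confirmation message so the reader stays on the page.
    form.innerHTML = response.ok
      ? "<p>Thanks! You're all set.</p>"
      : "<p>Something went wrong. Please try again.</p>";
  });
}
```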

The key points from Task 1 were:

  • Fix the search link and comment field bugs.
  • Relocate the comments form to the top of the comments string.
  • Get rid of the "Thank You" pages and have forms submit and complete without refreshing the page.

Task 2: Locating and Registering for Webinars
While Kaley made the correct assumption that the top webinar listed on our webinars landing page is the upcoming one and the others listed beneath have already occurred, most of our testing volunteers were confused here. The text indicating the date of the upcoming webinar, as well as the "register" link, are both small enough to be regularly missed by users. Additionally, Kaley uncovered a bug I'd never seen before. When she added her email address to the simple form on our webinar detail page and then clicked the "Register" button, she was redirected to our basic company contact form. When I reviewed the video, I noticed she clicked the button twice, which might have something to do with it. In any case, I've never seen that happen before, so we'll have to look into it. But once she went back and hit the "Register" button again, she—along with every other volunteer—was confused by being redirected to GoToWebinar's offsite registration form. She did successfully complete a registration, but the process revealed plenty of kinks to work out.

The key points from Task 2 were:

  • Redesign the webinars landing page to make the upcoming/previous distinction more clear.
  • Emphasize the "register" call to action on the landing page.
  • Eliminate the redundancy of the GoToWebinar registration form.

Task 3: Agency Projects? What Agency Projects?
Finally, I asked each of our volunteers to find our most recent project done in partnership with an agency and its price range. Every one of them was confused by this. Kaley put it well just seconds after hearing the task: "I wouldn't know whether to click Agency Partnership or Featured Projects." She started with our Agency Partnership page, which, unfortunately, didn't give her what she was looking for. Her next logical step was door #2—the Featured Projects page—which she browsed for some time before wondering if she still might be in the wrong place. Unfortunately, she was in the right place—it's just not designed in a way that emphasizes the agency relationship.

Kaley's session, as well as those with the other volunteers, made clear that we need to do a better job interweaving the information on our Agency Partnership, Featured Projects, and Pricing pages. As each is so directly tied to our positioning, it's problematic that they are currently so confusing to use and yield so little clarity to the user.

The key points from Task 3 were:

  • Functionally tie together our Agency, Featured Projects and Pricing pages.
  • Redesign the Featured Projects landing page to be more user-friendly.
  • Emphasize the agency partnership on case study detail pages.

 

Post-Test Debriefing

Each volunteer test session revealed so much to us that the excitement of learning something new quickly eclipsed any possible feeling of defeat we might have experienced as a result of realizing our website was not perfect.

We reconvened the same project team from our last redesign in order to review the findings and discuss a plan of action, which, as you might imagine, turned out to include many other changes in addition to those I've shared with you here. Again, we're treating this work just as we would any other client work, and having already come to a consensus on the scope and cost—we've estimated it at around 10% of the initial project cost, which is on the low end of the expected range of first-year/phase-two costs for a site like ours—we are currently working out the schedule with the goal of completion by late fall 2011.

Comments

Mike | June 7, 2011 3:48 AM
Hey, this is a good case study. I am doing a study myself with real people for some projects. If I would do the test on the form here, what I miss are real labels like: name, email, url, comment... When I click inside the form the description is gone, and when I get distracted while writing this and come back to the form I don't know what to write inside... ;)
Christopher Butler | May 17, 2011 10:50 AM
Ellen: We haven't used any heatmapping software…yet. But, as to your question about the value of the webcam portion of the videos, I think it is helpful to have. We record the videos at actual size (the embedded versions are, of course, smaller), so we do get more detail that way. It also helps for general cues, like matching their eye movement and/or other expressions to what is happening on the screen they're using. That general information helps to explain long pauses in mouse movement or page scrolling, or something else that might be hard to explain without it.

Jacob: Very cool! I'll add this to my list of tools to try out. Thanks for letting us know about it!

Ryan: I'm so glad to hear that this was a helpful resource to you—that's our main goal. Thanks for stopping by and saying so. Keep me posted when you get a site up!

Ryan Bentin (Not online yet) UGH | May 17, 2011 2:18 AM
I am in the beginning phases of building my website and am far from this phase. With that being said, this article was perfectly motivating and helpful. How simple yet so effective is this process? Thank you so much for providing this to us, knowing this method eases my mind on some things. Oh, and I will be purchasing Steve Krug's book on user testing because of this article.
Jacob Creech | May 5, 2011 10:30 PM
Interesting post, really well written and very insightful!

I have a handy tool to add to your list (although since I work on it, I'm probably a little biased) http://intuitionhq.com

Basically, you upload your designs, create tasks and then send it out to whoever you like via email, facebook, twitter or any other medium that suits. It then generates a heatmap based on their interactions with your site.

It's quite good because you don't even have to get people outside of the comfort of their own home (or office), and it requires very little in the way of setup for you. Since you can see the results in realtime, you can also make changes and tweak your designs on the fly. It works well for us, anyway.

Thanks again for sharing; always interesting seeing how people come at these sort of tasks.

Cheers!
Ellen | April 28, 2011 9:22 PM
How helpful is the webcam video in the long run? It's pretty small and difficult to really get much from the facial expressions of the person using the site. Have you used any heatmapping software like in the Nielsen article I'm linking to?
Christopher Butler | April 28, 2011 8:33 AM
Jeri: Glad you enjoyed it. Thanks for saying so!

Brett: We'll certainly keep it coming. I'm glad to hear you're willing to spend the extra time reading through longer articles like this one. Thanks for your comment!

JT: I'm glad you said that—this is an intentionally simple approach. We've noticed that usability testing tends to just not get done much once a site is launched, though that is increasingly the best time to do it. So rather than pushing a very complex approach on our clients, whom we already know are having difficulty doing everything they know they should do, we felt it was important to start small and go for the low-hanging fruit. As it turns out, doing that is, on its own, incredibly productive. And that's the whole point, I think: that if a 15-minute usability review can produce these kinds of helpful and actionable insights, we have no excuse not to do them!

As for your second point about recruiting more in-the-know testers—I'm not sure I agree. I am sure there are cases, which, as you say, might require a level of familiarity the average person probably doesn't have in order to properly assess the effectiveness of an application, but I think basic usability (again, we're going for a simple approach here) is something common to all applications. Anyone who regularly uses the web should be qualified. As I mentioned in the article, anyone too close to the website—the designer, developer, project manager, etc.—is not unfamiliar enough to do this well. However, they'd make great observers of the tests.

Oh, one last thing about improvisation. That's a great word for this. Lauren Walstrum just posted some tips for usability tests like these, and I think the subtext of all of them is flexibility. Going into even a simple process with a set plan is a good idea, but I found through doing a series of them that improvising the follow-up questions based upon what I was observing was key to extracting the truth about what our volunteers were experiencing. That may not be the most scientific thing in the world, but at this point, I'm content with a testing process that is one part science, one part art.

Thanks for sparking a convo about this.
JT | April 27, 2011 8:45 PM
I like the presentation, so don't get me wrong when I say this seems a little simplistic. I'm having trouble seeing how this process would scale and be at all adequate to something more sophisticated, like an app or a site with thousands of users.

I've not been involved in many professional user studies, but I'd guess that the more complex the application (and I realize you're talking about less complicated sites here), the more the user groups doing the testing would have to be at least familiar with what the application is about generally, like industry norms and such. This primer feels a bit more improvised than all that...Am I right?
Brett Wilson | April 27, 2011 1:06 PM
I'm always impressed with Newfangled - you never let me down. In the course of a busy week, I love taking time to slow down and see what fantastic insights you have to share - in a relevant and helpful way. Thank you - keep it coming!
Jeri Hastava | April 27, 2011 12:16 PM
Great case study. Thank you!
Christopher Butler | April 27, 2011 9:47 AM
Alex: Glad you enjoyed it! The testing process was actually a lot of fun for us, too.

Alan: I really felt that using ourselves as a case study—especially because we haven't yet implemented any improvements—would really help to make the case for doing testing like this and being expectant of similar results. Thanks for reading!
Alan Beberwyck | April 27, 2011 9:11 AM
Thanks for sharing a simple process for conducting a very enlightening exercise – this should be required learning for anyone working on websites. I appreciate your transparency and honesty in sharing the perceived shortcomings of your own site – made the content more engaging and credible. Well done.
Alex | April 26, 2011 5:51 PM
Great read! Loved the story feel, the title, the intro, and the image. They all definitely pulled me in to something I would probably not ordinarily be that interested in. The videos were a nice surprise, too. Was nice to see the process in action.
