Quality Assurance (QA), according to the online Project Management glossary, is defined as "a planned and systematic pattern of all actions necessary to provide adequate confidence that the item or product conforms to established technical requirements." In other words, QA is a standardized method of ensuring that everything works and looks as intended. QA for the web should include an initial site-specific test plan, a round of browser testing, and a generous integration phase during which the client can evaluate functionality while entering content.
Quality Assurance Test Plans
Every website built around a content management system (CMS) will have a significant amount of common functionality that requires testing. This general evaluation might include anything from testing the CMS login to exporting website form data, and is not primarily concerned with how things appear visually. Of course, if something is radically out of place visually, it should be noted. However long the testing list, the goal of this first step is to identify any flaws in the standard operation of the site and CMS; it can usually be performed once the whitescreen is complete and before the design has been applied.
The second step is to evaluate functionality specific to the website. Again, this stage is less concerned with how things appear visually than with how things operate. For example, an e-commerce website's store should be thoroughly tested with every relevant combination of products, accessories, and discount codes to make sure that even the most minor variable isn't overlooked. A site with a large content database that relies heavily upon an advanced search tool should be tested by running a wide variety of search queries. A site with complex form options should have many test submissions sent covering all options and combinations of options, and so on. While the test plan should be drawn up by the team--naturally, those most familiar with how the site is intended to function--the person performing it should be someone familiar with the technology, process, and purpose of QA, but new to the particular project being tested. Even if you don't have a dedicated QA role, you should endeavor to have fresh eyes on the site for this stage of QA.
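To illustrate, the exhaustive-combination approach can be sketched in a few lines of Python. Everything here is a hypothetical stand-in: the products, accessories, discount codes, and pricing rules below aren't from any real store, they just show how to enumerate every variable so nothing is overlooked:

```python
from itertools import product

# Hypothetical option sets for an example checkout; a real test plan
# would pull these from the site's actual configuration.
products = ["tshirt", "mug"]
accessories = [None, "gift-wrap"]
discount_codes = [None, "SAVE10"]

def price(item, accessory, code):
    # Stand-in pricing logic for the sketch only.
    base = {"tshirt": 20.00, "mug": 10.00}[item]
    if accessory == "gift-wrap":
        base += 3.00
    if code == "SAVE10":
        base *= 0.90
    return round(base, 2)

# Enumerate every combination so even the most minor variable is covered.
cases = list(product(products, accessories, discount_codes))
for item, accessory, code in cases:
    total = price(item, accessory, code)
    assert total > 0, f"non-positive total for {item}/{accessory}/{code}"

print(f"{len(cases)} combinations checked")  # 2 * 2 * 2 = 8
```

The same `itertools.product` pattern scales to search queries or form-field options: list each variable's values once, and the test cases write themselves.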
For a more in-depth look at what the QA role should look like, read Nolan Caudill's blog post on The Most Important Person on the Team.
Realistically, there is going to be some overlap between the test plan and browser testing. Common site functions, such as form submissions, can behave unpredictably in different browsers. But once the site functionality has been thoroughly vetted, the site needs to be tested, page by page, in every browser you officially support.
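One way to keep that page-by-page coverage honest is to track a simple test matrix. This Python sketch uses hypothetical page and browser lists (substitute your actual site map and support policy) to flag any page/browser pair that hasn't been checked off:

```python
from itertools import product

# Hypothetical inputs for the sketch; a real plan would enumerate the
# actual site map and the browsers you officially support.
pages = ["/", "/about", "/contact"]
browsers = ["IE 6", "IE 8", "Firefox 3.5", "Safari 4"]

# Every (page, browser) pair that must be tested, initially unchecked.
matrix = {pair: False for pair in product(pages, browsers)}

def record_pass(page, browser):
    matrix[(page, browser)] = True

# Simulate a testing session that misses one combination.
for page, browser in matrix:
    if (page, browser) != ("/contact", "IE 6"):
        record_pass(page, browser)

# Report anything that slipped through the cracks.
untested = [pair for pair, done in matrix.items() if not done]
print(untested)  # [('/contact', 'IE 6')]
```

Even kept in a spreadsheet rather than code, the point is the same: make the full matrix explicit up front so an untested combination is visible rather than forgotten.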
Browser testing can be done in a variety of ways, including running multiple physical machines, running virtual machines locally, running a centralized virtual machine server, or using a third-party testing service. We've been experimenting with using CrossBrowserTesting.com, which can test browser performance and some interactive functionality for any URL you submit. It can even generate screenshots of any URL in every current browser simultaneously (I ran this test on my blog page, which you can see in the image above). Plans range from as low as $19.95 per month to as high as $199.95 per month, but the value of being able to quickly check against the particularities of multiple browsers is well worth the cost.
By the way, despite being nine years old and out of compliance with today’s web standards, Internet Explorer 6 is still being used by a considerable portion of the population. While tools like CrossBrowserTesting.com make it easy to check out how a website looks in IE 6, they don't change the fact that it often requires a lot of extra effort to make a current site look and function correctly in it. At this point, it doesn't make much sense to do more than ensure that websites degrade well to IE 6; guaranteeing perfect performance in it is a losing battle. Even our site isn't perfect in IE 6!
These screenshots of my blog page (click image to enlarge) were automatically generated by CrossBrowserTesting.com. At first glance, they look pretty close, but there are some slight spacing differences in the page as viewed by Internet Explorer 6.0.
Integration is QA
Once the test plan is complete, the site is probably ready for content entry (we'll cover this a bit more in next month's article). Remember, this often happens before the design has been applied, so it's important that the client is prepared to see and use a work in progress. Though it's not officially considered a QA method, I believe that content entry, or integration, is one of the most effective and important QA efforts for any project.
Typically, integration is the point in a project when a client is able to fully experience the reality of their site for the first time. While they have worked closely with the team on prototyping and design, the process of using the CMS to create and enter content is when all the "dots" are connected and made real, and often the first point at which expectations are clarified. With that in mind, here's some straight talk:
No matter how thorough the prototype is, sometimes there are concepts or needs that cannot be communicated until you are immersed in an actual working and producing environment. This is similar to the "blank-slate-shopper" phenomenon: Have you ever seen a review of a book and thought that you'd like to purchase it, only to find that the next time you are actually in a bookstore you have no idea what you want or where to start? This is because we tend towards reactive rather than proactive thinking. We hear about a book and react to it with, "Yes, I'd like to read that," yet when we get to the store and are surrounded by thousands of books, we react to them all by drawing a blank. In web development, things are reversed a bit. Prototyping can be like the store, offering lots of attractive options unrefined by the future reality of how a site will be used. "Yes, I'd like my site to do that!" But integration will always catch the flaws in a site, be they many or few, because users will quickly react when they can't do what they expected to be able to. "Sure, the slideshow is nice, but I need to change the address in the footer!"
Finally, QA does not ensure that a project will be 100% bug free. While some bugs are due to sloppiness or haste and can be quickly identified by QA, others are the result of unforeseen functionality conflicts that may not become evident until a site has been used for a while--despite the best intentions and foresight of the team. As with any development project, bugs like these should be expected and met with patience. (Need I remind you of how buggy some expensive operating systems are when they launch?) While we hope that the various steps of QA will reduce the frequency of bugs, we are definitely not surprised when they show up.
Coming Up in Part 2...
It's not done yet! Next month, we'll look at the last two steps of a web project, content entry and going live. Once a site is live, it begins its real life, so we'll also cover content strategy and nurturing for the future of a website.