Your website needs to work on many different browsers, and to make sure it does you need to make cross-browser QA a standard part of your process. This is the first in a multi-post series on how to set up cross-browser QA and make it easy enough to integrate into your existing processes.
Which Browsers Should You QA?
Before you can conduct a meaningful QA, you need to identify which browsers you need to review your site on. A good place to start is with the browsers that people are currently viewing your site with.
Look at the last 30 days of data in your analytics suite and make a list of the browsers and versions that represent at least 5% of traffic (a quick script for the 5% cutoff follows the steps below). To get this list in Google Analytics:
- Navigate to Audience -> Technology -> Browser & OS
- Select “Browser Version” as the Secondary dimension
- Click the Pie chart button for percentages
You can even save a link to this report with a shortcut.
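If you would rather do the 5% math outside the GA interface, a minimal sketch like the one below can handle it. It assumes you have exported the Browser & OS report (with Browser Version as the secondary dimension) to a CSV containing `Browser`, `Browser Version`, and `Sessions` columns; the file name and column headers are placeholders for whatever your export actually produces.

```python
import csv
from collections import Counter

# Assumed export of the Browser & OS report with Browser Version as the
# secondary dimension; adjust the file name and column names to match yours.
CSV_PATH = "browser_report.csv"
THRESHOLD = 0.05  # browsers at or above 5% of sessions

sessions = Counter()
with open(CSV_PATH, newline="") as f:
    for row in csv.DictReader(f):
        key = (row["Browser"], row["Browser Version"])
        sessions[key] += int(row["Sessions"].replace(",", ""))

total = sum(sessions.values())
print("Browsers at or above the 5% threshold:")
for (browser, version), count in sessions.most_common():
    share = count / total
    if share >= THRESHOLD:
        print(f"{browser} {version}: {share:.1%} ({count} sessions)")
```

Dropping the threshold to 0.01 gives you the 1% view discussed below if your business needs warrant it.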
In general, you can ignore the myriad versions of Chrome and Firefox and just test the latest version of each. However, if you have significant traffic from both Macintosh and Windows, you should test Chrome and Firefox on each platform, as there can be subtle differences.
You should definitely review any browser with more than a 5% share of visitors. You can go as low as 1% depending on your business needs and how many actual visitors are using a particular browser.
Standard Browser List
A good standard browser review list is:
- Latest Chrome on Mac and Windows
- Latest Firefox on Mac and Windows
- Latest Safari on Mac OS 10.7 or 10.8
- IE8 on Windows 7
- IE9 on Windows 7
- IE10 on Windows 7
- Chrome on Android 4
- Safari on iOS 6
These are likely to be your most common browsers, but you should absolutely adjust the list based on your actual traffic.
Which Pages Should You QA?
Now that you know which browsers you’re going to QA, you need to figure out which pages to review. Your site is likely too big to QA every page, so, just like in Always Be QA’ing (ABQ), we need to develop a list of pages that are a representative subset of your site; but this time we want to focus on page components rather than content.
When we’re doing cross-browser QA we are looking at how pages render and making sure that everything displays well, so we can’t use the same metrics for generating a page list as for ABQ. Your homepage should definitely be on the list because of its importance in the visitor experience and because its layout is often unique. Beyond this, you need to identify pages that represent design or component archetypes.
Most sites are built with a standard set of components that are reused across pages (lists, boxes, tables, special treatments, etc.). Try to identify pages that encompass as large a collection of these components as possible so that your QA is more effective. Better yet, create Component Key Pages that gather those components in one place.
You can also look at your analytics to identify the most popular pages on your site. Again, keep in mind that we’re focused on making sure pages render correctly; in this case it makes sense to include the pages people visit most often in your list.
With all of that in mind, create a list of no more than 8 pages to review. If you can get it down to 3-5, even better: the more pages you have to review, the longer the QA takes and the less likely you are to do it. It’s better to have a small set of pages that you actually review than a large set that you ignore.
Document a Plan
You now have a list of browsers and pages. Open up your word processor, text editor, or Evernote and document them. Every quarter, review your browser and page lists to make sure they still match your traffic.
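Plain text is perfectly fine, but if you want the plan in a form that tooling can read later, a simple structured file works too. The sketch below is one hypothetical way to capture the browser and page lists; the browser entries, URLs, and file name are placeholders to adapt to your own plan.

```python
import json
from datetime import date

# Hypothetical QA plan: adjust browsers and pages to match your own traffic.
qa_plan = {
    "last_reviewed": str(date.today()),  # revisit quarterly
    "browsers": [
        "Latest Chrome on Mac and Windows",
        "Latest Firefox on Mac and Windows",
        "Latest Safari on Mac OS 10.7 or 10.8",
        "IE8 / IE9 / IE10 on Windows 7",
        "Chrome on Android 4",
        "Safari on iOS 6",
    ],
    "pages": [
        "https://www.example.com/",          # homepage
        "https://www.example.com/products",  # lists and boxes
        "https://www.example.com/pricing",   # tables and special treatments
    ],
}

with open("qa_plan.json", "w") as f:
    json.dump(qa_plan, f, indent=2)
```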
Scheduled QA
Since most sites are fairly static and design changes are infrequent, conducting the cross-browser QA once a month should be plenty. Set aside a couple of hours, spin through the pages in your identified browsers, and make sure that everything displays correctly and that any JavaScript behavior works as expected.
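If you want a head start on that monthly pass, a small script can open each page in the desktop browsers you have installed so all you have to do is eyeball the rendering. The sketch below uses Selenium WebDriver, which is not part of the original workflow described here, and assumes Selenium plus the matching browser drivers are installed; it loads each page and prints the title as a crude sanity check, while the visual review is still up to you.

```python
from selenium import webdriver

# Hypothetical page list from your QA plan; replace with your own URLs.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/products",
    "https://www.example.com/pricing",
]

# Only the desktop browsers you can drive locally; mobile browsers still
# need real devices or emulators.
DRIVERS = {
    "chrome": webdriver.Chrome,
    "firefox": webdriver.Firefox,
}

for name, driver_class in DRIVERS.items():
    driver = driver_class()
    try:
        for url in PAGES:
            driver.get(url)
            # A loaded title is only a "did it render at all" check;
            # the actual visual inspection still has to be done by a person.
            print(f"{name}: {url} -> {driver.title!r}")
    finally:
        driver.quit()
```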
Next week we’ll cover how to get set up for Cross-Browser QA on the Desktop.