Continuing our series reviewing browser testing tools, we are having a look at CrossBrowserTesting, which allows you to check your website live across the major web browsers and operating systems, and now also includes several mobile devices.
The main difference between CrossBrowserTesting and other browser checking tools like Mogotest or Browserlab is that with CBT you can actually test your website ‘live’ by using the browser instead of receiving screenshots or a report showing you what is wrong.
Of course, this additional offering doesn’t come cheap. Pricing starts at $19.95 per month for just 150 minutes of use. However, if you consider that the alternative could be to put together your own testing computers with all the operating systems and browsers you require then $20 a month might not be so bad.
Let’s see if CrossBrowserTesting is worth it.
CrossBrowserTesting is a paid service, but there is a 7-day free trial with each package, which allows up to 1 hour of usage for 1 person.
Unfortunately, in order to sign up you do have to input your credit card details, but if you cancel before the 7 days are up then you won’t get charged.
Once signed up you are dropped into your Test Center and have the option of running a live test, taking screenshots of a URL or installing the bookmarklet (more about that later).
As the main purpose of CrossBrowserTesting is the ability to run live tests, we’ll look at that first of all.
Clicking on ‘Run Live Test’ opens a new window with all the available operating system and browser combinations, plus the screen resolution can be picked from a list.
You can also view what additional software is installed on each operating system if you need applications such as Adobe Acrobat Reader, Flash, Silverlight, Quicktime, etc. This is extremely helpful, as it means you can test the full website, including animations, PDF downloads and video clips.
The availability of browsers and operating systems is pretty good, with all versions of the main browsers available, including Internet Explorer, Google Chrome, Firefox, Opera and Netscape, and even older browsers going back to Firefox 1.0, Internet Explorer 5.0 and Netscape 4. You don’t have to test your website in these seldom-used browsers, but they are there if you need them.
Newer browsers are also present including Internet Explorer 9 RC and Firefox 4 Beta, not yet fully released but very useful to start testing your sites on.
I chose a Windows 7 OS with Internet Explorer 9 RC installed, at a resolution of 1024 x 768, on which to test the WebDepend website.
The desktop loaded in pretty quickly and I double clicked on the Internet Explorer icon to get started straightaway. Browsing to www.webdepend.co.uk was also fairly quick; there is a slight delay as you type the web address into the address bar, and it took a couple of seconds for the WebDepend website to appear, but first impressions were that the system was fairly responsive.
There is an additional bar at the top of the screen, which shows the operating system you are using and also allows you to end the testing session, take a screenshot of the page you are viewing and record your session to play back later on.
I wanted to test the dropdown navigation on the WebDepend site, as this can’t be tested using screenshot applications, and so I hovered the mouse over each menu item to see how the navigation displayed once it had opened out. I did have to wait a second or two each time for the browser to catch up with where my mouse was pointing.
Using the scrollbar also took a couple of seconds to scroll to the part of the page I wanted to view and so I can see a lot of browser testing starting to become quite tedious in this respect (but browser testing can be fairly tedious anyway I suppose).
Another area that you can’t properly test with screenshot tools is enquiry or contact forms. I wanted to fill in an enquiry form to make sure the validation and submission worked correctly in IE9. The form was completed pretty swiftly and submitted fine; I received the test email within seconds, so there was no delay in being able to test filling in forms.
Ecommerce websites could make great use of CrossBrowserTesting to fully test the shopping process in each web browser and operating system configuration.
Whilst testing a website live, simply click on the snapshot icon and it will grab the screen that you are on and save it in your test center. You can then add notes to the snapshot plus what browser you were using at the time for future reference. A publicly accessible link to the screenshot can be shared on social networks.
One useful enhancement to this would be the ability to send snapshots straight into a bug tracking tool such as Unfuddle, Pivotal Tracker or Fogbugz and create a new bug. I would certainly appreciate being able to create new bugs in Pivotal Tracker, which we use at WebDepend, in this manner.
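To give a flavour of what that integration might involve, here is a rough sketch of pushing a snapshot link into Pivotal Tracker as a new bug. This is purely illustrative: the snapshot URL, project ID and token are placeholders, and CrossBrowserTesting does not currently offer anything like this, so the wiring is my own guess built on Tracker’s REST API.

```python
import json
import urllib.request

# Pivotal Tracker's REST API base; the project ID and API token used with
# it below would be placeholders supplied by the user.
TRACKER_API = "https://www.pivotaltracker.com/services/v5"

def build_bug_payload(snapshot_url, browser, notes):
    """Build the JSON body for a new bug story referencing a snapshot."""
    return {
        "story_type": "bug",
        "name": f"Layout issue in {browser}",
        "description": f"{notes}\n\nSnapshot: {snapshot_url}",
    }

def file_bug(project_id, token, payload):
    """POST the bug to Pivotal Tracker (network call, not run here)."""
    req = urllib.request.Request(
        f"{TRACKER_API}/projects/{project_id}/stories",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-TrackerToken": token,
        },
    )
    return urllib.request.urlopen(req)

# Example payload using a made-up snapshot link from a test session.
payload = build_bug_payload(
    "https://example.com/snapshot/123",  # placeholder snapshot link
    "IE9 RC / Windows 7",
    "Dropdown navigation overlaps the header.",
)
print(payload["name"])
```

In practice the snapshot URL would come from the publicly accessible link that CBT already generates for sharing, which is what makes this kind of hand-off plausible.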
Once you have recorded a video you can watch it back and add notes plus the browser you were using for future reference, as with snapshots. A publicly accessible link to the video clip can be shared on social networks.
Fairly new to CrossBrowserTesting is the fact that you can test how websites appear on an iPad, iPhone or an Android 2.2 device.
Click on ‘Run Live Test’ as with the Live Testing carried out above but choose one of the mobile operating systems to launch.
For this test I chose Android 2.2, which has the Android 2.2 web browser at a resolution of 320 x 480. I waited around 60 seconds whilst my configuration was being prepared, but then an on-screen error message appeared saying that it had not launched successfully, that the technical team had been informed, and to try again.
Once I did try again, after waiting another 60 seconds, a virtualized Android operating system appeared and I could launch the Android web browser.
As described above, there is a slight lag whilst the browser interprets your mouse click or keyboard action but it works pretty well and you can successfully browse the website you want to test, using the Android device including navigation buttons and keyboard.
You will probably want a greater range of mobile browsers and operating systems if you intend on doing a lot of mobile device testing but having iPhone, iPad and Android represented is a good start.
CrossBrowserTesting does also include automated screenshots with every pricing plan. In your test center click on ‘Take Screenshots’ and then input your URL including any advanced options such as whether a login is required or if you want to delay the screenshot by a couple of seconds.
Then select the browsers you need screenshots taken for; you can customise the list, or selecting all browsers will give you a pretty healthy list.
It is also possible to specify the resolution for each browser, set to 1024 x 768 by default or 640 x 960 for the iPhone 4. At present it doesn’t appear possible to have screenshots taken for an Android 2.2 device.
The final step is to click ‘Take Screenshots’ and CBT will start churning through all the screenshots and presenting them to you as they are taken.
Screenshots are taken pretty quickly, with all 41 browsers returning a screenshot after about 3 or 4 minutes. I was running these tests at 2pm UK time, which is 8am where CrossBrowserTesting is based. Hence, all times are recorded as being 8am, which might confuse me when I come back to the results and try to work out when I carried out the testing.
Checking through the screenshots I spotted a small layout issue on the WebDepend home page when viewed on an iPad (I don’t own an iPad unfortunately and so hadn’t tested on this device previously) so the test highlighted that bug to me, which was useful.
But be careful as using the screenshots facility adds to the overall minutes used. I managed to rack up 31 minutes of usage in checking through the screenshots produced without realising.
Against each screenshot is a handy link to launch a live test session for that specific browser and OS combination so if you spot a problem you can quickly investigate in more detail plus test the rest of your website to find any further problems.
There is also a bookmarklet that allows you to take screenshots or launch a live testing session on any web page that you are visiting. This allows quick and easy access to starting your browser compatibility testing.
One of the areas not utilised as part of this review is the API, which allows you to automatically trigger screenshots of a URL and then email them to an email address to be checked. I suppose it could also enable you to send screenshots into a bug tracking system as described above.
At present, I believe only the screenshots functionality is included in the API but there are plans to extend that to the live testing side of the system too.
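As a sketch of how the screenshots API might be scripted, here is one way an automated run could be triggered. Be aware that the endpoint path and parameter names below are assumptions for illustration, not CrossBrowserTesting’s documented API, and the browser identifiers and email address are made up.

```python
import urllib.parse

# Hypothetical API base; consult CBT's actual API documentation for the
# real endpoint and parameter names.
API_BASE = "https://crossbrowsertesting.com/api"

def build_screenshot_request(url, browsers, email=None):
    """Build a hypothetical screenshot-run request URL for the given site."""
    params = {
        "url": url,
        "browsers": ",".join(browsers),  # assumed OS/browser identifiers
    }
    if email:
        params["email"] = email  # where the finished screenshots get sent
    return f"{API_BASE}/screenshots?{urllib.parse.urlencode(params)}"

# Example: request screenshots of the WebDepend site in three
# (made-up) browser configurations, emailed to a QA address.
request_url = build_screenshot_request(
    "http://www.webdepend.co.uk",
    ["Win7-IE9", "OSX-Safari5", "iPad-Safari"],
    email="qa@example.com",
)
print(request_url)
```

Scheduled from a cron job, something along these lines could deliver a fresh set of cross-browser screenshots to your inbox every morning, which is the sort of workflow the API seems aimed at.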
Overall, CrossBrowserTesting is a very useful service for any browsers or operating systems you don’t normally have access to. For me this would be several browsers including IE6, older versions of Firefox, Safari, Netscape, Opera and mobile browsers including the iPad, iPhone and Android 2.2. That’s quite a number and whilst I might not test everything on all the older browsers, I do want to make sure everything displays and works on mobile browsers plus some clients still require IE6 compatibility.
Another advantage I can see is for website developers and app companies reacting to users’ problems: they can try to reproduce an issue on the specific OS and browser combination that user has installed. Smaller organisations may not have all these combinations to hand, so a service like CrossBrowserTesting becomes essential.
I can also see scenarios where I am out of the office and away from my normal testing environment. Being able to test on up to 41 web browsers from anywhere I can get online is a huge bonus.
CrossBrowserTesting is very easy to use and get to grips with straightaway. There are little details that make browser testing easier, such as the ability to go into a live test from a particular screenshot you had taken. Having the bookmarklet and API adds to the overall sense that the application is easy to use however you want to use it.
On the downside there are small delays in using the live testing so over a full session that means browser testing takes longer than it would if you were using the machine locally.
The main drawback is that minutes soon rack up, whether you are using Live Testing or Screenshots, so the base package of just 150 minutes per month would get used up very quickly. I wonder whether the ability to purchase extra minutes or a pay-as-you-go service would be something for the folks at CrossBrowserTesting to consider.
However, even with the slight negative points, using CrossBrowserTesting has to be less hassle, cheaper and far more convenient than getting together all the operating systems and browser combinations available plus the maintenance and space required to house all the equipment.
I certainly know what I prefer.