Usability Testing Responsive Design (case study)

Keith Doyle describes a WhatUsersDo usability study he carried out for a responsive website. Keith is a freelance information architecture trainer and consultant trading as Navopia.

I have been collaborating with Paddy Callaghan at the University of Bradford to develop patterns for a responsive redesign of their undergraduate study website. Paddy’s blog describes the study from the University’s perspective. In this article, I will focus on the issues specific to responsive websites.

Background to the Study

In the University’s recent plans to update their undergraduate section, they wanted to make the website responsive. They aimed to create a better user experience on mobile devices and tablets. The strategy was to apply high-priority responsive design changes, carry out a small usability study to identify the most important usability issues, and then address each of the top issues identified. The primary focus of the study was to investigate the information architecture – how easy is it to navigate around the responsive website?

Definition of a Responsive Website

A responsive website is one where a web page uses modern browser technologies to provide different layouts depending on such factors as the width of the screen. Normally, desktop users will not notice any difference, but smartphone users will get the impression that the page has been designed for mobile – something which participants in this study commented on, saying things like, “this page has been designed for mobile”. Unlike a dedicated mobile site, the same web page is served to both mobile and desktop, removing the need to provide different pages for different devices.
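As a brief illustration (not taken from the University’s actual stylesheets – the class names and 600px breakpoint below are invented for the example), this kind of width-dependent layout is typically achieved with CSS media queries:

```css
/* Wide screens: navigation sits beside the content in two columns */
.page {
  display: flex;
}
.local-nav {
  width: 25%;
}
.content {
  width: 75%;
}

/* Below an illustrative 600px breakpoint, stack the columns so the
   same page reads naturally on a phone held in portrait mode */
@media (max-width: 600px) {
  .page {
    display: block;
  }
  .local-nav,
  .content {
    width: 100%;
  }
}
```

Because the browser applies whichever rules match the current viewport, desktop and mobile users receive the same HTML page; only the layout differs.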

How WhatUsersDo’s Mobile Testing Service Works

Participants are given tasks to carry out. Because it is a remote test, the tasks have clear starting pages and are written with very specific instructions. The WhatUsersDo mobile user testing solution records smartphone and tablet users with a camera (so that hand movements can be seen) and a microphone (to capture their spoken thoughts). Desktop users have their screen and voice recorded. At the end, users can be given an exit question to answer online.

About the Study

The study consisted of five tests. As mobile design patterns are the newer parts of the site and as it was expected that the smaller screen sizes would present the greatest usability issues, the priority was to test on mobile devices. We therefore commissioned three mobile tests, one tablet test and one desktop test. The tasks set mostly involved known‑item searching. For example, we wanted to find out:

  • How participants would go about finding out how many UCAS points were required for a named course.
  • How they would navigate to specified information about accommodation.

About the Participants

All mobile users held their phone in portrait mode. The tablet user held their iPad in landscape mode. All the users on mobile devices were confident in zooming and panning when navigating the parts of the University site which are not yet responsive. Users were connected to the internet through Wi-Fi, so speed was generally not an issue in this test. One user did experience speed issues, but this seemed to be related to the user’s Wi-Fi connection rather than to the University website.


Time on Task

The average time for the mobile users to complete the tasks was 30% longer than the average for desktop and tablet users. It seems logical that mobile users will on average take longer, as they lack the context of the desktop banner and left-hand navigation panels, and lack immediately visible, clickable navigation options.


At times, navigation was more difficult on a mobile device because not all the navigation options were visible at once on the smaller screen. However, in some cases the mobile view helped navigation: the local navigation was more visible than the global navigation, encouraging users to stay within the correct section.


There were some smartphone usability issues related to proximity of content, location of content, and the unavailability of tooltip information for contextual help. The usability study helped to identify and prioritise these.


There was one technical issue which caused users little difficulty: the width of a table on one of the pages led to a small amount of horizontal scrolling on a smartphone. The WhatUsersDo study enabled us to classify this as a lower-priority usability issue.

Technical Issues

The study identified three other technical issues on the site which could be quickly fixed.

Study Issues

The screen recordings were good enough to analyse. One voice recording was fairly quiet, and one of the smartphone users had a slightly shaky hand. However, replacement videos are available if a recording is not of sufficient quality to analyse.

As this was a pilot test, participant numbers were low. The recommended number of participants partly depends on whether a one-off study is being carried out or a series of smaller studies. It could be worth testing fewer tablet devices than smartphones and desktops, as smartphones will surface touch-related issues, and the tablet landscape view is often similar to the desktop view in a responsive site.

Conclusions and Recommendations

The study identified some of the key usability, information architecture and technical issues on the University of Bradford undergraduate website. It demonstrated that when remote testing a responsive site, both the mobile and desktop views should be tested, as distinct usability issues were identified in each view.

The findings in tablet view would probably be uncovered in the desktop view (for layout issues) and mobile view (for touch issues). However, I would still recommend including some tablet testing as there may be issues which the study did not identify due to its small sample size. The exception to this would be if the site being tested included a tablet-optimised view – although in general responsive sites are not designed for specific devices but rather for their capabilities.

For the University of Bradford, the test not only revealed areas for improvement, but also provided validation from real users of why they adopted responsive design in the first place. A video of one user’s closing comments summed this up.

Recommendations for Testing Responsive Sites on WhatUsersDo

  1. It is best to test roughly equal numbers of smartphone and desktop users, but consider testing fewer tablet users, unless you have a specific tablet-optimised site.
  2. It does not seem to be a problem for smartphone users if part of a website is responsive and part of it is not, so it is not necessary to avoid non-responsive parts of the site in the task list.
  3. Have an idea of how much longer it will take to carry out the tasks on smartphones compared with tablets and desktops.
  4. Look out for issues which desktop users have and smartphone users do not have, as well as the other way round.
