
Digital First Media - User Testing

Complete with a test room and an observation room separated by a two-way mirror.

Build a company-wide User Testing Program

Digital First Media owns over 800 websites and apps throughout the United States. As the UX Art Director, I was part of a small team tasked with coordinating usability testing on a couple of their small sites. By increasing outside interest in the tests, we were able to build a company-wide User Testing Program. The program has influenced a shift in company culture, as many teams are now integrating user-centric design principles into their workflow.

We set up a versatile user testing lab in order to conduct a wide variety of test types, including desktop and mobile usability testing, focus groups, card sorting, and more.

Our observation room allows team members to watch the test while also seeing the user's screen and webcam to observe mouse movements and facial expressions.

Moderating a test requires the ability to make participants feel comfortable enough to speak openly. Once that happens, the moderator should focus on listening and observing.

Building a Lab

To conduct weekly testing, we wanted to establish a dedicated space to use as a lab. We began researching locations within the building and discovered a meeting space that was initially built for focus groups, only to be converted into a training room a year after construction. The space consists of both a test room and an observation room, separated by a two-way mirror. By repurposing equipment and furniture from elsewhere in the building, we were able to create a versatile testing space for almost nothing. With lightweight, movable furniture, we are able to reconfigure the room to accommodate usability testing, focus groups, card sorting, and a variety of other test setups.

The Stakeholder Interview

All testing begins with a conversation with the team(s) involved to gain a thorough understanding of the project requirements and any questions or problems they have encountered. This information is used to determine the subject matter for testing along with the type of test to conduct. After the test, we meet with the team again to compare the goals of the test with the outcomes and determine any actionable items or areas that require further testing.

Types of Testing

Building a well-rounded testing program requires working with stakeholders to identify the questions they are trying to answer through testing. By understanding the overall project goals, we can determine the type of test that will yield useful data. Some of the test types we have conducted include Moderated Usability Testing, Competitor Analysis, Prototype Testing, Card Sorting, Tree Testing, and Clickmap Analysis. We are also working with Adobe to implement Adobe Target, their A/B and multivariate testing tool.

Development of the User Testing Program

Process Summary

Prior to developing the User Testing Program, no testing had been done on the Denver Post website. Since this was a new initiative, we wanted to establish a baseline regarding the site's user experience. To do so, we conducted a series of usability tests. The goal was to gather overall opinions of the site design as well as to test whether users could complete key tasks set forth by the editorial, sales, and development divisions of the company.

Test Process

A total of 25 test sessions were conducted. Each session was recorded and streamed online via web conferencing to include team members outside of the Denver office. Testing focused on task analysis and user interviews in order to determine pain points in navigation, along with users' impressions of site layout and content.


To increase stakeholder participation in the test process, we used an iterative process when deciding which tasks to test. After each round of testing, we would review the data received and work with stakeholders to determine areas that required more testing. Sample tasks included asking users to post a classified ad, find an article specific to the area they live in, and walk us through their daily routine on the site (for existing users).

Observations and Outcomes

Testing produced a number of useful observations. Here are two of the key findings and the benefits they provided the company:

Ineffective Ad Placements

Participants commented that the advertising was having a negative effect on their experience with the site. Most of them said that they understood the necessity for ads, but that the site had breached their tolerance level. We used footage of these tests to bridge the gap between the editorial and sales departments and have them work together to find a solution. A full audit of all ads on the site was conducted, and it was discovered that in one month the company had spent over $400k on remnant ad placements that used site space but did not generate revenue. The company is working to remove these ad placements, which will reduce page clutter and generate savings of almost $5 million over the next year.
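As a rough sanity check on the figures above (assuming the ~$400k of remnant spend from the audited month held steady every month, which is an assumption), the annualized savings works out to roughly $4.8 million, consistent with the "almost $5 million" estimate:

```python
# Annualizing the audited remnant-ad spend.
# The $400k/month figure comes from the one-month audit; treating it as a
# flat monthly rate is an assumption made for this back-of-envelope estimate.
monthly_remnant_spend = 400_000  # dollars spent on non-revenue remnant placements
annual_savings = monthly_remnant_spend * 12

print(f"Estimated annual savings: ${annual_savings:,}")  # Estimated annual savings: $4,800,000
```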

Unpopular Site Content

We asked participants to identify elements on our pages that caught their attention or seemed interesting.

After a series of tests, we noticed that certain elements received very little attention from users. We presented our recommendations to stakeholders, who conducted a review of the elements in question. After looking at the traffic metrics, they determined that our findings were correct and removed some of the elements. One of the widgets removed was costing the company approximately $500,000 per year. By eliminating this widget, the company was able to recover that cost and reduce page clutter, resulting in a better user experience.